Drone Telemetry Anomaly Detection with Self‑Supervised JEPA

Detect tampered drone flights from telemetry alone—no anomaly labels required for training.

Course: TAMU CSCE 625 — Artificial Intelligence (Spring 2026)
Team: Yaswanth Reddy Yaradoddi · Siva Sai Deepank Manoj · Ubaid Khan Mohammed
GitHub: https://github.com/yaswanthreddyyyr/Drone-Anomaly-Detection-JEPA


TL;DR

Drone telemetry streams (GPS/altitude/speed/heading) can be spoofed or manipulated mid‑flight. We built a self‑supervised JEPA encoder that learns normal flight dynamics from normal logs only, then uses a Local Outlier Factor (LOF) detector on learned embeddings to flag suspicious chunks at inference time.

Best configuration (JEPA v3 + PCA + LOF‑Manhattan): AUC 0.7719, F1 0.685, composite Score 2.60 (+35.9% vs. our first JEPA setup).
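The detection stage described above (embeddings → PCA → LOF with a Manhattan metric) can be sketched with scikit-learn. This is a minimal illustration, not the tuned configuration: the embedding dimensions, `n_components`, and `n_neighbors` below are placeholder values, and the random Gaussian "embeddings" merely stand in for JEPA encoder outputs.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
# Stand-ins for JEPA embeddings of flight chunks (128-d is hypothetical).
train_emb = rng.normal(size=(500, 128))            # normal flights only
test_emb = np.vstack([
    rng.normal(size=(50, 128)),                    # normal test chunks
    rng.normal(loc=4.0, size=(10, 128)),           # simulated anomalous chunks
])

# Reduce embedding dimensionality, then fit LOF in novelty mode on normal data.
pca = PCA(n_components=32).fit(train_emb)
lof = LocalOutlierFactor(n_neighbors=20, metric="manhattan", novelty=True)
lof.fit(pca.transform(train_emb))

# Lower decision scores = more anomalous; predict() returns -1 for outliers.
scores = lof.decision_function(pca.transform(test_emb))
labels = lof.predict(pca.transform(test_emb))
```

Because LOF is fit with `novelty=True` on normal embeddings only, no anomaly labels are needed at training time, matching the project's constraint.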


1) Why this problem matters

Modern drone applications—delivery, agriculture, inspection, and public safety—depend on a continuous telemetry feed. If an attacker tampers with GPS or related signals (e.g., injecting fake waypoints, shifting timestamps, or causing coordinate jumps), the drone can deviate from its route, behave unpredictably, or crash.

The operational constraint is important: real attack examples are scarce in practice, so labeled anomalies cannot be assumed at training time.

This project targets that exact setting: anomaly detection without labeled attacks during training.


2) Data: DJI flight logs and anomaly types

We used DJI drone telemetry logs (Kaggle) with waypoint‑level binary labels for evaluation. Training uses normal flights only; anomalies are used only for validation/test evaluation.
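Since the detector scores chunks while the dataset carries waypoint-level labels, the flights must be sliced into windows and each window given a chunk-level label for evaluation. Here is one plausible sketch of that step; the window/stride values and the "anomalous if any waypoint is anomalous" rule are assumptions for illustration, not the project's exact preprocessing.

```python
import numpy as np

def make_chunks(features, labels, window=32, stride=16):
    """Slice one flight into overlapping windows.

    A chunk is marked anomalous (1) if any waypoint inside it
    carries an anomaly label; otherwise it is normal (0).
    """
    X, y = [], []
    for start in range(0, len(features) - window + 1, stride):
        X.append(features[start:start + window])
        y.append(int(labels[start:start + window].max() > 0))
    return np.stack(X), np.array(y)
```

For training, only chunks from normal flights (all-zero labels) would be fed to the JEPA encoder; the labeled chunks are reserved for validation and test scoring.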

Anomaly families (9)

| Anomaly type | What it looks like |
| --- | --- |
| altitude_spike | Sudden jump in altitude |
| coordinate_jump | Abrupt GPS position change |
| deletion_gap | Missing waypoints |
| heading_inconsistency | Heading conflicts with displacement |
| injection | Fake waypoints inserted |
| precision_rounding | Coordinates rounded suspiciously |
| speed_inconsistency | Speed doesn't match displacement |
| timestamp_drift | Timestamps gradually desynchronize |
| combined | Multiple anomalies at once |
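Two of the simpler anomaly families above can be sketched as injection functions on raw telemetry arrays. The magnitudes (`30.0` m, `0.01` degrees) are hypothetical, chosen only to illustrate the shape of the tampering; the actual dataset's perturbation parameters may differ.

```python
import numpy as np

def inject_altitude_spike(alt, idx, magnitude=30.0):
    """altitude_spike: add a sudden jump (meters, hypothetical size)
    at a single waypoint, leaving the rest of the series intact."""
    out = alt.copy()
    out[idx] += magnitude
    return out

def inject_coordinate_jump(lat, lon, idx, dlat=0.01, dlon=0.01):
    """coordinate_jump: shift every coordinate from idx onward,
    simulating an abrupt, persistent GPS position change."""
    lat2, lon2 = lat.copy(), lon.copy()
    lat2[idx:] += dlat
    lon2[idx:] += dlon
    return lat2, lon2
```

Injections like these are applied only to validation/test flights; the training set stays purely normal.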

Train/validation/test splits