This space is dedicated to our Hackathon Project: SHARK USV Autonomous Navigation & Intelligent Obstacle Avoidance
<aside>
Welcome to the team! Our mission is to design and prototype an autonomous navigation system that…
- Consumes and interprets real-time LiDAR scanning data as well as a video feed to detect, classify, and track nearby vessels, buoys, and static hazards
- Implements swarming algorithms to produce smooth, natural waypoint plans to a given destination and updates them dynamically as new sensor data arrives
- Predicts obstacle trajectories using temporal tracking and proactively adjusts course to avoid conflicts using emergent navigation behaviors
- Maintains robust navigation during challenging maritime conditions including sensor noise, water reflections, and partial occlusions
Communication
- Email contact: george@deepdock.xyz
- Coordination/Preparation: Konstantinos
- Hackers: Deno, Konstantinos +2
- Mentor: George
</aside>
Technical Approach
Perception Layer:
- 360° scanning LiDAR system (TF03 rangefinder on servo actuator) — to be designed and built during the hackathon
- GoPro Hero7 camera with object detection neural network (YOLO/MobileNet/custom model)
- Real-time obstacle detection and clustering
- Sensor fusion combining visual semantics with geometric ranging data
- Temporal tracking for moving object velocity estimation
- Sensor fusion with GPS/IMU for robust localization
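The obstacle-detection-and-clustering step above can be sketched in a few lines. This is a minimal illustration, assuming the servo-scanned TF03 delivers an angle-sorted list of (angle, range) readings per sweep; the function names and the 0.5 m gap threshold are assumptions, not part of the project spec:

```python
import math

def polar_to_xy(angle_deg, range_m):
    """Convert one LiDAR beam (angle, range) to Cartesian x/y in metres."""
    a = math.radians(angle_deg)
    return (range_m * math.cos(a), range_m * math.sin(a))

def cluster_scan(scan, gap_threshold=0.5):
    """Group consecutive scan points into obstacle clusters.

    scan: list of (angle_deg, range_m) sorted by angle.
    Consecutive points closer than gap_threshold (metres) are merged
    into one cluster; each cluster is summarised by centroid + size.
    """
    clusters, current = [], []
    for angle, rng in scan:
        p = polar_to_xy(angle, rng)
        if current and math.dist(current[-1], p) > gap_threshold:
            clusters.append(current)
            current = []
        current.append(p)
    if current:
        clusters.append(current)
    # summarise each cluster as (centroid_x, centroid_y, n_points)
    return [
        (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c), len(c))
        for c in clusters
    ]
```

Each cluster centroid can then be matched against camera detections for the vision/LiDAR fusion step, and tracked across sweeps for velocity estimation.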
Navigation Layer:
- Bio-inspired boids algorithm for multi-objective path planning
- Weighted behavioral rules (obstacle avoidance, goal seeking, path smoothing)
- Class-aware safety margins: larger buffers for boats/kayakers, smaller for buoys/markers
- Dynamic speed control based on obstacle proximity and threat level
- Emergency stop logic for critical collision scenarios
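A minimal sketch of how the weighted behavioral rules could combine into one steering command, covering goal seeking, obstacle repulsion, class-aware margins, proximity-based speed control, and the emergency stop. The margin values, weights, and thresholds below are placeholder assumptions to be tuned on the water:

```python
import math

# assumed class-aware safety margins in metres (to be tuned)
SAFETY_MARGIN = {"boat": 10.0, "kayaker": 10.0, "buoy": 4.0, "marker": 4.0}

def steer(pos, goal, obstacles, w_goal=1.0, w_avoid=2.0, max_speed=2.0):
    """Blend goal-seeking and obstacle avoidance into (heading, speed).

    pos, goal: (x, y) in metres.
    obstacles: list of (x, y, class_name) from the fused perception layer.
    """
    # goal-seeking rule: unit vector toward the waypoint
    gx, gy = goal[0] - pos[0], goal[1] - pos[1]
    gnorm = math.hypot(gx, gy) or 1.0
    sx, sy = w_goal * gx / gnorm, w_goal * gy / gnorm

    nearest = float("inf")
    for ox, oy, cls in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy) or 1e-6
        nearest = min(nearest, d)
        margin = SAFETY_MARGIN.get(cls, 6.0)
        if d < margin:
            # avoidance rule: repulsion grows as the obstacle
            # penetrates its class-specific safety margin
            push = w_avoid * (margin - d) / margin
            sx += push * dx / d
            sy += push * dy / d

    heading = math.atan2(sy, sx)
    # dynamic speed control: slow down as the nearest obstacle closes in
    speed = max_speed * min(1.0, nearest / 15.0)
    if nearest < 2.0:  # assumed emergency-stop distance
        speed = 0.0
    return heading, speed
```

Adding a path-smoothing rule (e.g. damping heading changes between cycles) fits the same weighted-sum structure, which is what makes the boids formulation attractive here.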
Control Layer:
- MAVLink interface to ArduRover autopilot
- Real-time heading and velocity commands
- Failsafe integration with manual override capability
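One way the planner's heading/speed output could be handed to ArduRover is via a velocity-only SET_POSITION_TARGET_LOCAL_NED setpoint through pymavlink. This is a sketch under assumptions (connection string, command rate, and ArduRover's GUIDED-mode handling all need verification on the actual vehicle); the manual-override failsafe stays with Herelink/RC regardless:

```python
import math

def heading_speed_to_ned(heading_rad, speed_mps):
    """Convert a planner heading/speed into NED velocity components
    (north, east), assuming heading is measured from north."""
    return speed_mps * math.cos(heading_rad), speed_mps * math.sin(heading_rad)

def send_velocity(master, heading_rad, speed_mps):
    """Stream a velocity setpoint to ArduRover over MAVLink.

    Uses SET_POSITION_TARGET_LOCAL_NED with a type_mask that ignores
    the position and acceleration fields, commanding velocity only.
    """
    from pymavlink import mavutil  # assumption: pymavlink on the companion computer
    vn, ve = heading_speed_to_ned(heading_rad, speed_mps)
    master.mav.set_position_target_local_ned_send(
        0,                        # time_boot_ms (not used by the autopilot)
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0b0000111111000111,       # type_mask: velocity fields only
        0, 0, 0,                  # position (ignored)
        vn, ve, 0,                # velocity NED, m/s
        0, 0, 0,                  # acceleration (ignored)
        0, 0)                     # yaw, yaw_rate (ignored)

# typical usage (connection string is an assumption):
# from pymavlink import mavutil
# master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
# master.wait_heartbeat()
# send_velocity(master, heading_rad=0.0, speed_mps=1.5)
```

Setpoints like this must be re-sent at a few Hz or the autopilot will time out, which doubles as a dead-man safeguard if the perception stack hangs.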
Success Criteria
Minimum Viable Demonstration:
- SHARK autonomously navigates to a waypoint 50+ meters away
- Camera detects and classifies at least 2 object types (e.g., "boat", "buoy")
- LiDAR provides ranging to detected objects
- System fuses vision + LiDAR for enhanced obstacle awareness
- Maintains safe clearance margins (>5 meters from hazards)
- Generates smooth, boat-appropriate motion paths
Stretch Goals:
- Detect and avoid a moving obstacle with trajectory prediction
- Demonstrate class-specific safety behaviors (wider berth for boats vs. buoys)
- Complete multi-waypoint survey mission through obstacle field
- Achieve >90% object classification accuracy on maritime objects
- Real-time visualization showing fused LiDAR + vision perception
- Predict obstacle trajectories and proactively re-route
- Demonstrate recovery from GPS degradation using dead reckoning
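For the trajectory-prediction stretch goals, a simple starting point is a constant-velocity closest-point-of-approach (CPA) check: extrapolate the tracked obstacle's velocity and ask how close it will come to the USV within a planning horizon. A sketch under that constant-velocity assumption (the 30 s horizon is an arbitrary placeholder):

```python
import math

def closest_approach(p_usv, v_usv, p_obs, v_obs, horizon_s=30.0):
    """Predict time and distance of closest approach, assuming both
    craft hold constant velocity.

    p_*: (x, y) positions in metres; v_*: (vx, vy) velocities in m/s.
    Returns (t_cpa_s, distance_m) with t clipped to [0, horizon_s].
    """
    rx, ry = p_obs[0] - p_usv[0], p_obs[1] - p_usv[1]   # relative position
    vx, vy = v_obs[0] - v_usv[0], v_obs[1] - v_usv[1]   # relative velocity
    v2 = vx * vx + vy * vy
    # minimise |r + v*t|; clip to the planning horizon
    t = 0.0 if v2 == 0 else max(0.0, min(horizon_s, -(rx * vx + ry * vy) / v2))
    dx, dy = rx + vx * t, ry + vy * t
    return t, math.hypot(dx, dy)
```

If the predicted CPA distance falls below the obstacle's class-aware safety margin, the planner can re-route proactively rather than reacting only when the obstacle is already close.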
Provided Resources
Hardware (All sensors and equipment will be provided):
- SHARK USV platform (1.7 m hull with propulsion and steering) — already operational under manual control
- Orange Cube autopilot with ArduRover firmware
- TF03 LiDAR rangefinder with 360° servo scanning system
- Here4 GPS module for global positioning
- Herelink remote control system with video telemetry
- GoPro Hero7 Black camera for mission documentation
- LiPo battery system with full day endurance
- Waterproof electronics enclosures
- NVIDIA Jetson Nano/Orin Super for advanced processing
- Raspberry Pi 3+ SENSE for basic processing
Additional hardware can be purchased on Saturday 11.10.2025 if needed.