Phoenix-Wings

Development of autonomous navigation and object-detection-based control for UAVs, with real-life implementation on edge devices. We leverage ML algorithms on non-GPU edge devices that are economically viable.

The problem Phoenix-Wings solves

1. The problem we aim to tackle is eliminating human-dependent UAV systems and building a reliable autonomous UAV capable of object detection, with full programmatic control through Python-based autopilot libraries such as MAVSDK-Python (a minimal control sketch follows this list).
2. Most onboard computers on drones are extremely costly because high-end computing is done on-board. We instead use low-cost edge devices such as a Raspberry Pi, which acts as a gateway to the drone while the heavy computing is offloaded to the ground, making the system far more economical.
3. Most importantly, we aim to reduce human intervention to net zero, making drones smarter and truly autonomous, especially in disasters such as floods, fires, and earthquakes, where a drone can find and help rescue people from a bird's-eye view fully autonomously. The same autonomy also serves everyday applications such as smart surveillance and drone food delivery, where we ideally aim to turn the infamous "30 min or free" offer into an "8 min or free" offer!
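
The sketch below shows the kind of MAVSDK-Python control flow mentioned in point 1, assuming a drone (real or simulated) reachable over UDP on the default port 14540; the address and the plain arm/takeoff/land sequence are illustrative assumptions, not our exact mission code:

```python
import asyncio
from mavsdk import System


async def run():
    # Connect to the autopilot; udp://:14540 is a common default, adjust for your setup
    drone = System()
    await drone.connect(system_address="udp://:14540")

    # Wait until the autopilot is discovered over MAVLink
    async for state in drone.core.connection_state():
        if state.is_connected:
            break

    # Minimal autonomous sequence: arm, take off, hover briefly, land
    await drone.action.arm()
    await drone.action.takeoff()
    await asyncio.sleep(10)
    await drone.action.land()


if __name__ == "__main__":
    asyncio.run(run())
```

In the full system, the takeoff/land calls would be replaced by navigation commands driven by the object-detection output.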

Challenges we ran into

1. We had planned to use a simulator to replicate a real-life drone running our preferred firmware, ArduPilot. However, once the simulator build was ready and the whole sequence for integrating the AI/ML pipeline with the drone-control library MAVSDK-Python was in place, the entire execution failed: we learned that the firmware we were using did not support our simulated environment at all. We tackled it by switching firmware completely and learning it by reading its documentation from the beginning.

2. When the jury came by for the presentation and judging session, they were very happy with the project, but asked whether the idea could be implemented so that it actually sits on the drone. Thinking quickly, we called a friend to bring us a Raspberry Pi, a monitor, and a keyboard, and got to work porting what we had running on a 1 lakh INR laptop to a 5,000 INR Raspberry Pi, using older firmware versions that are compatible with the Raspberry Pi. We individually ran through four different AI algorithms, since most were too graphics-intensive for the Raspberry Pi, and finally built on TensorFlow's edge-device variant, so it was quite an intensive task (a rough inference sketch follows this list).
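
For the edge-inference side, this is a rough sketch of running a detection model on a Raspberry Pi with the lightweight tflite_runtime interpreter; the model file name `detect.tflite` and the SSD-style output layout are assumptions for illustration, not our exact model:

```python
import numpy as np
import tflite_runtime.interpreter as tflite  # lighter than full TensorFlow on a Pi

# Hypothetical model path; any detection model exported to .tflite works here
interpreter = tflite.Interpreter(model_path="detect.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy frame standing in for a camera capture (e.g. from OpenCV or picamera2),
# resized to the model's expected input shape
height, width = input_details[0]["shape"][1:3]
frame = np.zeros((1, height, width, 3), dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

# A typical SSD-style export returns bounding boxes as the first output tensor
boxes = interpreter.get_tensor(output_details[0]["index"])
print(boxes.shape)
```

Detections from this loop are what feed the MAVSDK-Python control logic shown earlier.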
