Crashing is almost inevitable for untrained pilots flying a drone for the first time. When they realize how fast it moves, they tend to panic and crash the copter. Control is also an issue when you fly a drone in a tight space or among crowded trees in a forest. Because of the many crashes that happen, the MIT Computer Science and Artificial Intelligence Laboratory (MIT CSAIL) and Stanford's Autonomous Systems Laboratory (ASL) have each developed algorithms to help drones evade obstacles around them.
MIT CSAIL researcher and PhD student Andrew Barry, together with his thesis advisor, Professor Russ Tedrake, developed a stereo-vision algorithm that runs 20 times faster than existing software. The algorithm allows the drone to detect objects and build a full map of its surroundings in real time. Operating at 120 frames per second, the software extracts depth information in 8.3 milliseconds per frame. This means the drone can fly at high speed, detect objects with its cameras while moving fast, swerve around them, and avoid crashing.
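The core idea behind any stereo-vision system is recovering depth from the disparity between two camera views. The sketch below is not Barry's actual algorithm (his code is far more heavily optimized), just the basic geometry it relies on; the focal length and baseline values are illustrative, not the MIT drone's real camera parameters.

```python
# Minimal depth-from-disparity sketch (illustrative parameters, not
# the MIT drone's actual camera calibration). For a calibrated stereo
# pair, depth is Z = f * B / d, where f is the focal length in pixels,
# B is the baseline between the two cameras in meters, and d is the
# disparity (pixel offset of the same feature between the two images).

def depth_from_disparity(disparity_px, focal_px=425.0, baseline_m=0.10):
    """Convert a disparity in pixels to a depth in meters."""
    if disparity_px <= 0:
        return float("inf")  # no match found / object effectively at infinity
    return focal_px * baseline_m / disparity_px

# A feature with a 10-pixel disparity sits 425 * 0.10 / 10 = 4.25 m away:
print(depth_from_disparity(10.0))
```

Nearby obstacles produce large disparities and distant ones produce small disparities, which is why a fast-moving drone cares most about computing this quickly for close-range objects.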
Barry's stereo-vision algorithm focused on improving data processing and maneuverability for a fixed-wing drone, while Stanford's ASL developed a real-time kinodynamic planner that lets a quadrotor avoid obstacles. Kinodynamic planning is a class of problems in which velocity, acceleration, and force/torque bounds must be satisfied together with kinematic constraints such as avoiding obstacles.
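To make the definition concrete, a kinodynamic planner must reject any candidate trajectory that violates either kind of constraint. The check below is a toy illustration with made-up limits, not ASL's planner: each state along a trajectory is tested against a velocity bound, an acceleration bound, and a minimum clearance from obstacles.

```python
# Toy kinodynamic feasibility check (hypothetical limits, not the
# values used by Stanford ASL's planner).

V_MAX = 8.0    # m/s, assumed velocity bound
A_MAX = 20.0   # m/s^2, assumed acceleration bound

def feasible(states, obstacles, clearance=0.5):
    """states: list of (position, velocity, acceleration) along one axis.
    obstacles: obstacle positions to keep `clearance` meters away from."""
    for pos, vel, acc in states:
        if abs(vel) > V_MAX or abs(acc) > A_MAX:
            return False  # dynamic (velocity/acceleration) bound violated
        if any(abs(pos - ob) < clearance for ob in obstacles):
            return False  # kinematic (obstacle) constraint violated
    return True

print(feasible([(0.0, 2.0, 1.0), (1.0, 3.0, 0.5)], obstacles=[5.0]))  # within bounds
print(feasible([(0.0, 9.0, 1.0)], obstacles=[5.0]))                   # too fast
print(feasible([(4.8, 2.0, 1.0)], obstacles=[5.0]))                   # too close
```

The point is that dodging an obstacle is not enough: the swerve itself must be one the vehicle's dynamics can actually execute.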
Ross Allen and Marco Pavone of Stanford's Autonomous Systems Laboratory have developed an algorithmic framework that allows real-time motion planning and obstacle avoidance for high-speed robotic systems. As with Barry's stereo-vision algorithm, the drone can detect objects, swerve around them, and avoid them altogether autonomously. The framework combines an offline-online computation paradigm, neighborhood classification through machine learning, sampling-based motion planning with an optimal-cost distance metric, and trajectory smoothing to achieve real-time planning for aerial vehicles. It accounts for dynamic obstacles with an event-based replanning structure and a locally reactive control layer that minimizes replanning events. The framework has successfully demonstrated real-time planning around static obstacles, low-speed dynamic obstacles, and even high-speed, adversarial obstacles.
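To give a flavor of what "sampling-based motion planning" means, here is a minimal rapidly-exploring random tree (RRT) sketch in 2D. This is a textbook baseline, not the ASL framework itself, and the obstacle, start, goal, and step sizes are all invented for illustration: the planner grows a tree of collision-free states by random sampling until a branch reaches the goal.

```python
# Toy RRT sketch (a generic sampling-based planner, NOT Stanford ASL's
# actual framework). All positions, obstacle sizes, and tuning values
# below are illustrative.
import math
import random

random.seed(0)

OBSTACLES = [((5.0, 5.0), 1.5)]   # (center, radius) circles
START, GOAL = (1.0, 1.0), (9.0, 9.0)
STEP, GOAL_TOL = 0.5, 0.6

def collision_free(p):
    """A point is valid if it lies outside every obstacle circle."""
    return all(math.dist(p, c) > r for c, r in OBSTACLES)

def steer(src, dst, step=STEP):
    """Move from src toward dst by at most `step`."""
    d = math.dist(src, dst)
    if d <= step:
        return dst
    t = step / d
    return (src[0] + t * (dst[0] - src[0]), src[1] + t * (dst[1] - src[1]))

def rrt(max_iters=5000):
    nodes = {START: None}         # maps each node to its parent
    for _ in range(max_iters):
        # Bias 10% of samples toward the goal to speed up convergence.
        sample = GOAL if random.random() < 0.1 else (
            random.uniform(0, 10), random.uniform(0, 10))
        nearest = min(nodes, key=lambda n: math.dist(n, sample))
        new = steer(nearest, sample)
        if not collision_free(new):
            continue
        nodes[new] = nearest
        if math.dist(new, GOAL) < GOAL_TOL:
            path, n = [], new     # walk parents back to the start
            while n is not None:
                path.append(n)
                n = nodes[n]
            return path[::-1]
    return None

path = rrt()
print(len(path) if path else "no path found")
```

The ASL work layers much more on top of this basic idea, notably precomputing expensive pieces offline and using a learned classifier to prune neighbor queries, which is what makes real-time replanning against moving obstacles possible.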
And those are a couple of software improvements that make drones nimbler and help them avoid crashing. What do you think? Are they complete, or do they still need some improvement? Share your thoughts in the comment section.