
Researchers use stereo cameras for drone collision avoidance

Researchers from MIT, Texas A&M University, and Universidad Politécnica de Madrid have developed a relatively low-cost and effective method for drones to detect and avoid obstacles in flight, using stereo cameras to build a real-time depth map of the world around them.

Drones that can sense their surroundings usually rely on LiDAR, radar, or microphone arrays alongside visual cameras, and cameras are what the research team focused on. The other options typically carry high price tags and are therefore found only on expensive commercial drones, while stereo cameras are relatively cheap and easy to get your hands on. The drone pairs its stereo cameras with a detection model trained in Microsoft's AirSim simulator and runs it on an onboard NVIDIA Jetson TX2, allowing drones of all sizes to make use of the technology.

How it works

Because the two cameras sit a known distance apart, the slight difference in the angle at which each one views the scene can be used to triangulate how far away objects in front of the drone are. The stereo camera is used alongside Microsoft AirSim to train the detection model; the imagery captured by the stereo camera is then fed into that model, which outputs bounding boxes around objects along with a score indicating how confident each detection is.
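To give a rough sense of the geometry involved (this is a minimal sketch, not the team's actual code), here is a short Python example using OpenCV. It assumes a calibrated, rectified stereo pair with a made-up baseline and focal length, computes a disparity map, and converts it to metric depth.

```python
import cv2
import numpy as np

# Assumed calibration values for illustration only (not from the paper)
BASELINE_M = 0.10      # distance between the two cameras, in meters
FOCAL_PX = 700.0       # focal length of the rectified cameras, in pixels

# Load a rectified left/right image pair (hypothetical file names)
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching: disparity is the horizontal pixel shift of each feature
# between the left and right views
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Depth (meters) = focal_length * baseline / disparity; mask invalid pixels
depth = np.where(disparity > 0, FOCAL_PX * BASELINE_M / disparity, np.inf)

print("closest obstacle at %.2f m" % depth.min())
```

The closer an object is, the larger the shift between the two views, which is why nearby obstacles produce the largest disparity values and the smallest depths.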

The full abstract can be found below, and the full paper is available in the IEEE Xplore digital library.

Obstacle avoidance is a key feature for safe drone navigation. While solutions are already commercially available for static obstacle avoidance, systems enabling avoidance of dynamic objects, such as drones, are much harder to develop due to the efficient perception, planning and control capabilities required, particularly in small drones with constrained takeoff weights. For reasonable performance, obstacle detection systems should be capable of running in real-time, with sufficient field-of-view (FOV) and detection range, and ideally providing relative position estimates of potential obstacles. In this work, we achieve all of these requirements by proposing a novel strategy to perform onboard drone detection and localization using depth maps. We integrate it on a small quadrotor, thoroughly evaluate its performance through several flight experiments, and demonstrate its capability to simultaneously detect and localize drones of different sizes and shapes. In particular, our stereo-based approach runs onboard a small drone at 16 Hz, detecting drones at a maximum distance of 8 meters, with a maximum error of 10% of the distance and at relative speeds up to 2.3 m/s. The approach is directly applicable to other 3D sensing technologies with higher range and accuracy, such as 3D LIDAR.
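To illustrate how a depth map turns a 2D detection into the relative position estimate the abstract describes, here is a hedged Python sketch. The bounding box, camera intrinsics, and function name are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical pinhole intrinsics of the depth camera (illustrative values)
FX, FY = 700.0, 700.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point in pixels

def localize_detection(depth_map, box):
    """Estimate the 3D position (meters, camera frame) of a detected drone.

    depth_map: HxW array of metric depth values
    box: (x_min, y_min, x_max, y_max) bounding box from the detector
    """
    x0, y0, x1, y1 = box
    patch = depth_map[y0:y1, x0:x1]
    valid = patch[np.isfinite(patch) & (patch > 0)]
    if valid.size == 0:
        return None                      # no usable depth inside the box
    z = np.median(valid)                 # robust range estimate to the target
    u = (x0 + x1) / 2.0                  # box center, pixel coordinates
    v = (y0 + y1) / 2.0
    # Back-project the box center through the pinhole camera model
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])

# Example usage with a detection box around another drone:
# position = localize_detection(depth, (300, 200, 360, 250))
```

Taking the median depth inside the box is one simple way to reject background pixels that fall inside the detection; the paper's own localization method may differ.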

What do you think about stereo cameras being used as an alternative method to allow a drone to see the world around it? Let us know in the comments below.

Photo: IEEE




Author

Joshua Spires

Josh started in the drone community in 2012 with a drone news Twitter account. Over the years he has gained wide exposure for his aerial photography work, and he now spends his days writing drone content for DroneDJ while pursuing his own business.

