
UAVs: Position hold, collision avoidance and SLAM

LightWare’s mission is to give machines the gift of “sight”, allowing them to “sense” their world and enabling machine perception. We believe that if machines can “see” and “perceive” as humans do, they will be able to avoid many of the shortcomings that contribute to data errors and inaccuracies, not to mention pitfalls such as catastrophic collisions. This ability to sense has many valuable applications, all of which arm the LiDAR-equipped machine with accurate insights and produce far more precise results.

Position hold

Whereas many applications focus on using LiDAR to assess height above ground, which is achieved through a vertically oriented microLiDAR sensor, position hold makes it possible to maintain a relative position at a safe distance from an object, which is achieved through a forward-facing or horizontal orientation of the microLiDAR. Stability is, naturally, key here, as the object is to conduct accurate surveillance or inspection; however, it’s just as important to ensure that the drone or UAV on which the microLiDAR is mounted does not collide with the object in question. This is where LightWare’s microLiDAR sensors excel. This application is often used for tasks such as object tracking to ensure security, or the surveillance or maintenance of targets such as power lines, cellphone towers or wind turbines.
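To give a flavour of how a forward-facing range reading can drive position hold, here is a minimal proportional-control sketch. The function name, gain and target distance are all hypothetical illustrations, not LightWare’s API; real flight stacks implement this far more robustly.

```python
# Minimal position-hold sketch (illustrative only). We assume some
# hypothetical source supplies the forward LiDAR range in metres, and
# the returned value is a forward velocity command in m/s.

TARGET_DISTANCE_M = 5.0   # desired stand-off from the inspected object
KP = 0.5                  # proportional gain (illustrative tuning value)
MAX_SPEED_MS = 1.0        # clamp commanded speed for safety

def position_hold_step(measured_distance_m: float) -> float:
    """Return a forward velocity command that nudges the drone toward
    the target stand-off distance (positive = move toward the object)."""
    error = measured_distance_m - TARGET_DISTANCE_M
    speed = KP * error
    return max(-MAX_SPEED_MS, min(MAX_SPEED_MS, speed))
```

At exactly the target distance the command is zero; too far away it commands a (clamped) approach, too close it commands a retreat.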


Collision avoidance

As UAVs and drones are often used beyond visual line of sight (BVLOS), one of the most frequently encountered challenges is that they are prone to collisions, whether with power lines, buildings, birds, features in the terrain, or even slow-moving objects. The use of a LightWare microLiDAR mitigates this risk almost completely by providing information about the drone’s environment, which the drone can use to compute a new, collision-free route. This information is collated by employing LiDAR laser light emissions and the “time-of-flight” principle to make precise, accurate and fast measurements, making it possible for potential obstacles to be identified (and avoided) rapidly and accurately. What’s more, LightWare’s microLiDAR sensors integrate easily into popular flight controller systems such as ArduPilot and PX4.
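The arithmetic behind the time-of-flight principle is straightforward: the laser pulse travels to the target and back, so distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name is illustrative, not a real sensor API):

```python
# Time-of-flight distance: the laser pulse travels out and back, so
# distance = (speed of light * round-trip time) / 2.

SPEED_OF_LIGHT_MS = 299_792_458  # metres per second

def tof_distance_m(round_trip_time_s: float) -> float:
    """Convert a measured round-trip pulse time into a distance in metres."""
    return SPEED_OF_LIGHT_MS * round_trip_time_s / 2
```

A pulse that returns after roughly 100 nanoseconds corresponds to a target about 15 m away, which is why these sensors can take thousands of readings per second.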


Watch this video created by Randy Mackay of ArduPilot to see LightWare’s SF45/B microLiDAR put its collision avoidance capabilities to the test –

https://www.youtube.com/watch?v=SPu0a23FGKc


SLAM

SLAM (Simultaneous Localization and Mapping) makes it possible for a drone to develop a map of its surroundings while, at the same time, identifying its own position within that map. SLAM is required when a UAV or drone is operating in GPS-denied environments such as indoors or underground. In this way, a drone is able to develop an accurate view of its immediate environment, including obstacles and potential threats. LightWare’s microLiDAR sensors are able to build maps at speed, and with great accuracy, by using the time-of-flight principle to make precise, accurate and fast measurements. Moreover, on-board algorithmic processing ensures accurate measurements across a broad range of target colors and textures, which means excellent results in all circumstances. And because all processing occurs on-board, no dedicated processors are required, saving weight, cost and power. This information is invaluable for helping machines navigate their paths without difficulty. SLAM applications have proven especially valuable for ensuring safety in search and rescue missions.
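To illustrate just the mapping half of SLAM, here is a toy two-dimensional occupancy-grid update that marks the cell struck by a single range return, assuming the drone’s pose is already known. A real SLAM pipeline estimates pose and map jointly; all names and grid parameters here are illustrative.

```python
import math

# Toy 2-D occupancy grid: 0 = unknown/free, 1 = occupied.
GRID_SIZE = 20        # cells per side
CELL_SIZE_M = 0.5     # metres per cell

def mark_hit(grid, x_m, y_m, heading_rad, range_m):
    """Mark the cell struck by a LiDAR return as occupied, given the
    drone's position (x_m, y_m), its heading, and the measured range."""
    hit_x = x_m + range_m * math.cos(heading_rad)
    hit_y = y_m + range_m * math.sin(heading_rad)
    col = int(hit_x / CELL_SIZE_M)
    row = int(hit_y / CELL_SIZE_M)
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row][col] = 1

grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
# A return 3 m straight ahead of a drone at (2 m, 2 m) facing along x:
mark_hit(grid, x_m=2.0, y_m=2.0, heading_rad=0.0, range_m=3.0)
```

Sweeping the sensor across many headings fills in the grid, which is the map half of “simultaneous localization and mapping”.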


More LiDAR Basics

IoT
The Internet of Things (IoT) refers to a seemingly futuristic system of interrelated, Internet-connected machines and objects that are able both to collect and transfer data and to make decisions. This data is transferred over a wireless network via sensors and other embedded technologies, without human intervention.