We talk a lot about autonomous vehicles at Darwin; they’re a fascinating subject, after all. In our posts about autonomy levels and autonomous convoys, though, we haven’t addressed one of the most interesting autonomy questions: how do autonomous vehicles work?
This is a broad topic, so it makes sense to take it one step at a time. In today’s post, we’re going to talk about LiDAR sensors, and how self-driving cars use them to make sense of their surroundings.
LiDAR stands for ‘Light Detection and Ranging’. You might notice its similarity to ‘radar’, an acronym so common it’s usually written lowercase, which stands for ‘Radio Detection and Ranging’.
LiDAR and radar are similar; both systems send out electromagnetic waves and measure how long it takes those waves to be reflected back to a sensor. In doing so, they can determine how far away things are and build an image of their surroundings. Radar does this by sending out radio waves, whereas LiDAR does the same thing using light.
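The ranging principle behind both systems comes down to one line of arithmetic: a pulse travels to an object and back at the speed of light, so the distance is half the round trip. Here is a minimal sketch (the 100-nanosecond round-trip time is just an illustrative value):

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a reflecting object, given the round-trip time of a pulse.

    The pulse travels to the object and back, so we halve the total path.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after 100 nanoseconds implies an object about 15 m away.
print(distance_from_round_trip(100e-9))  # ≈ 14.99 metres
```

A LiDAR sensor repeats this measurement millions of times per second across many directions, building up a 3D "point cloud" of its surroundings.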
Autonomous vehicle developers, such as Navya, often incorporate LiDAR sensors into their vehicles.
If a vehicle is designed to operate without a human driver, it needs to be able to ‘see’.
You can use satellite navigation to tell an autonomous vehicle where it is, where it needs to go and what route it needs to take in order to get there. This is valuable information, but it’s not enough by itself. With maps and satnav alone, a vehicle might know that there’s a building to the left, but it won’t know that a sheep has wandered into the road. Autonomous vehicles need to be able to monitor their surroundings in real time, because it’s not possible to inform them of every potential obstacle in advance.
Light famously travels at phenomenal speed, which means that LiDAR sensors can pick up on changes in the environment almost instantly. This is valuable for autonomous vehicles, which need to detect and respond to obstacles straight away in order to operate safely.
Navya’s autonomous shuttles use a large number of sensors to monitor their surroundings, including LiDAR sensors. Two LiDAR sensors, located above the vehicle’s front windscreen, can each detect obstacles in a 360° range. Six more LiDAR sensors, each able to detect obstacles in a 180° range, are distributed across the rest of the vehicle.
These LiDAR sensors are accompanied by cameras, odometry sensors and a GNSS antenna for satellite navigation. All of these constantly feed information to the shuttle’s onboard computer, and the computer uses this information to make navigation decisions in real time. You can read more about Navya’s use of LiDAR on the Navya website.
We’re currently trialling one of these autonomous shuttles at Harwell Science and Innovation Campus, with the European Space Agency and UK Space Agency’s support. For more information, take a look at our Darwin Autonomous Shuttle page.
LiDAR sensors aren’t exclusively used for autonomous vehicles. They can also be found on roads in a very different context: they’re used in some speed cameras and handheld speed detection devices.
A LiDAR speed detection device will send out a pulse of light to determine the position of a vehicle, then a second pulse to determine how far the vehicle has moved. By comparing the distance the vehicle has travelled against the time between the pulses, the device can determine how fast the vehicle is moving.
There are also uses for LiDAR off the road. Some smartphone manufacturers have started incorporating LiDAR sensors into their products' cameras, where LiDAR's depth sensing can improve camera focus and power augmented reality apps.
3D scanning is another interesting capability of LiDAR. You can move around an object or environment with a LiDAR sensor, scanning as you go, and create a digital 3D model of that object or environment. This has exciting possibilities for, among other things, video games. In the 2018 blog post ‘Announcing Project Atlas’, EA’s chief technology officer Ken Moss writes about how LiDAR can be used on a game development platform:
In one example, we are using high-quality LIDAR data about real mountain ranges, passing that data through a deep neural network trained to create terrain-building algorithms, and then creating an algorithm which will be available within the platform’s development toolbox. With this AI-assisted terrain generation, designers will within seconds generate not just a single mountain, but a series of mountains and all the surrounding environment with the realism of the real-world.
Although many of LiDAR’s uses may seem futuristic, it can also help to give us a better understanding of the past. Archaeologists can use airborne LiDAR to gain a detailed view of a landscape, making it easier to spot signs of human impact. Light can’t penetrate the ground, but it can, to some extent, reveal what lies beneath vegetation; BBC Future has an article about how LiDAR gave new insight into a Mayan city hidden in the Belizean jungle.
At Darwin, we’re mainly interested in LiDAR’s applications for autonomous vehicles, but it’s a fascinating and versatile tool. With all these uses, it might not be long before ‘LiDAR’ is as common a household term as ‘radar’ is now.
Darwin Innovation Group is a UK-based company that provides services related to autonomous vehicles and communications. If you’re interested in working with us, take a look at our careers page. If you’d like to know how we can help your organisation make use of autonomous vehicles, contact us. You can also follow us on LinkedIn or Twitter.