Are eyes more important than ears? Does touch beat smell? Just like in human anatomy, autonomous vehicles have a catalogue of ‘sensors’ that make sense of the world in the absence of a human driver. But is one sensor more important, or better, than the other?
Let’s take cameras, for example. In daylight, provided there isn’t too much direct sunlight hitting the lens, they perceive their surroundings very well. In darkness or poor visibility, however, an alternative such as radar is needed. Radar isn’t affected by ambient light, and neither is lidar, which emits its own light signal.
But this is an oversimplification, of course. And I find it problematic to label one sensor as superior to another, as perhaps others have suggested. This is because to us, the primary indicator of effectiveness is (and always will be) safety. Sensors perform different roles in different contexts – sure, sight alone is great, but why exclude hearing if you can have it as well?
The challenges we face when it comes to different sensors are defined by the use case, i.e., where and in what conditions sensors will be used – a quarry versus a highway, for example. So rather than ask which sensor is ‘better’, it’s more useful to discuss which sensors are better suited to certain challenges.
We’ve got to take into consideration how external conditions will affect perception – whether that’s other vehicles or obstacles in the vicinity, or if there are adverse weather conditions that could impair performance. We select sensors that are state of the art in their respective fields and then adapt how we use them in our system, depending on use case.
In a confined area, like a quarry, we know exactly what’s in the Autonomous Operating Zone (AOZ). It’s on us to decide what kinds of vehicles and road users can be in the areas we operate. If, for example, we know there’s going to be a long, dark tunnel, we can adapt our perception and AD kit for this particular use case. That means perception, in the form of sensors, can be simplified for object detection and localization. Of course, weather is a variable we can’t control, but we have a good understanding of how various sensors perform in these conditions.
Areas that aren’t confined become trickier. The moment we go on public roads, greater demands will be put on us and further perception technology will be needed, as this is a far more complicated, ever-changing environment. We also need to be able to see further ahead, as we are travelling faster, and to perform safely in all road and weather conditions.
As mentioned before, use case defines which sensors are used in which context. In Mining & Quarries, a constantly changing and very challenging environment, lidar is used as the main tool for object detection and for perceiving the surrounding environment. Lidar provides clear and accurate object detection, emitting pulses of light in different directions that bounce back to give us the distance to every point.
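The distance measurement works by timing each pulse’s round trip. A minimal sketch of that time-of-flight calculation, with illustrative values not tied to any specific sensor:

```python
# Time-of-flight ranging: a lidar pulse travels to an object and back,
# so the one-way distance is half the round-trip path.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def distance_from_echo(round_trip_time_s: float) -> float:
    """Distance to a reflecting point, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A pulse returning after ~667 nanoseconds corresponds to roughly 100 m.
d = distance_from_echo(667e-9)
```

Repeating this for millions of pulses per second, each in a slightly different direction, is what builds up the lidar’s 3D picture of the surroundings.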
For Hub to Hub, we need lidar with a longer range to see further due to the high speeds of travel. We also need to adapt sensors for the environment they’ll be in. That means protecting them from vibrations and shocks, as well as ensuring they’re at the right temperature to prevent freezing or overheating.
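The link between speed and required lidar range can be sketched with basic stopping-distance arithmetic. The reaction time and deceleration figures below are illustrative assumptions for a heavy vehicle, not specifications from the text:

```python
def required_detection_range_m(speed_m_s: float,
                               reaction_time_s: float = 0.5,
                               decel_m_s2: float = 3.0) -> float:
    """Rough minimum sensing range: distance covered while the system
    reacts, plus braking distance v^2 / (2a). Parameter values are
    illustrative assumptions, not real vehicle data."""
    reaction_distance = speed_m_s * reaction_time_s
    braking_distance = speed_m_s ** 2 / (2 * decel_m_s2)
    return reaction_distance + braking_distance

# At highway speed (80 km/h ≈ 22.2 m/s) the vehicle must detect
# obstacles much further out than at quarry speed (30 km/h).
highway = required_detection_range_m(80 / 3.6)
quarry = required_detection_range_m(30 / 3.6)
```

Because braking distance grows with the square of speed, doubling the speed more than doubles the range the sensors must cover – which is why Hub to Hub needs longer-range lidar than a confined site.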
We must ensure that sensors are designed for functional safety and cyber security. This means adhering to legislative requirements such as ISO 26262 and the Machinery Directive. We must also ensure no one is tampering with devices or the overall system. We do this by designing a system that is safe from the beginning, taking a holistic approach to safety from the very first step.
Sensor technologies are evolving all the time – we must evolve how we use them and find new functionality and value in each sensor, e.g., in how we interpret its data. Sensors must also evolve from a safety and cybersecurity point of view: radars can offer higher resolution for improved environment detection; cameras can be improved to see in worse conditions and to detect temperature differences; imaging radars can gather more information; and so on. These are the kinds of developments I’d like to see moving forward.
As I’ve said before, we take a holistic view on safety, so it would be foolish to limit ourselves to one technology. Different sensor solutions perform better in different conditions – cameras are good, but not in darkness or bright sunlight, for example.
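The value of combining sensors can be sketched with a simple probability argument. Treating each sensor’s chance of missing an obstacle as independent in a given condition is a deliberate simplification, not a real fusion algorithm, and the miss probabilities below are invented for illustration:

```python
def combined_miss_probability(miss_probs: list[float]) -> float:
    """Probability that *every* sensor misses an obstacle, assuming
    (simplistically) independent failures. Redundant sensors multiply
    their individual miss probabilities together."""
    p = 1.0
    for m in miss_probs:
        p *= m
    return p

# Hypothetical figures: a camera struggling in darkness (0.5),
# backed up by radar (0.05) and lidar (0.02).
camera_alone = combined_miss_probability([0.5])
all_three = combined_miss_probability([0.5, 0.05, 0.02])  # roughly 0.0005
```

Even with these made-up numbers, the point stands: one weak sensor backed by two independent ones is orders of magnitude less likely to miss an obstacle than any single sensor alone – which is the redundancy argument for a multi-sensor system.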
At Volvo Autonomous Solutions, we’re designing solutions and systems that must be safe – so limiting ourselves to a single sensor technology is not useful to us or our overall mission. With a combination of sensors working together, we can add safety and productivity to our system, and we know this will benefit our transport solutions across a variety of use cases. We provide the end-to-end autonomous solution for our customers – it’s therefore important we invest in reliable and robust sensor portfolios to ensure efficient, safe, and sustainable transport solutions.