Meet Marilyn, the robot car that can see through fog
VTT Technical Research Centre of Finland has invented a robot car that is claimed to see better than humans in foggy conditions and can navigate without stopping — including in bad weather.
Named Marilyn, the robot car can see a human through fog and avoid accidents automatically. This is enabled by the LiDAR mounted on the car’s roof, which operates at wavelengths beyond the range of human vision. As the technology evolves, this represents a significant step towards the development of safe automated vehicles.
The latest additions to Marilyn include a 1550 nm LiDAR, which extends the optical wavelengths available to the car, and additional intelligence in its software, which improves the sensors’ capabilities. Software modules have also been built in to filter point clouds and assess scanner reliability. These ensure the vehicle can keep functioning in fog and powdery snow, conditions in which the LiDAR, ‘seeing’ at wavelengths outside the range of human vision, enables the robot car to see people better.
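The article does not describe how these modules work internally; the following is a minimal, hypothetical sketch of what point-cloud filtering and a scanner-reliability check of this kind might look like. The thresholds, the neighbour-count heuristic and the function names are assumptions for illustration, not VTT’s implementation.

```python
import numpy as np

# Hypothetical sketch: fog returns tend to be weak, close-range and isolated,
# so filter on intensity/range and on local point density.
def filter_fog_points(points, min_intensity=0.05, near_range=2.0,
                      radius=0.5, min_neighbors=3):
    """Return the subset of (x, y, z, intensity) rows unlikely to be fog backscatter."""
    xyz = points[:, :3]
    intensity = points[:, 3]
    rng = np.linalg.norm(xyz, axis=1)

    # Rule 1: drop weak returns very close to the sensor (typical fog clutter).
    keep = ~((intensity < min_intensity) & (rng < near_range))

    # Rule 2: drop isolated points with too few neighbours within `radius`.
    dists = np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=2)
    neighbor_counts = (dists < radius).sum(axis=1) - 1  # exclude the point itself
    keep &= neighbor_counts >= min_neighbors

    return points[keep]

def scanner_reliability(raw_points, filtered_points):
    """Crude reliability score: the fraction of returns that survive filtering."""
    return len(filtered_points) / max(len(raw_points), 1)
```

A score like the one above could then be used to decide how much weight a given scanner is given downstream, or whether the car should slow down rather than stop.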
“Although Marilyn’s vision is limited to roughly 30 m in thick fog, the new LiDAR type allows the car to be driven slowly rather than having [to come to a] full stop,” said Project Manager Matti Kutila, from VTT’s RobotCar Crew team.
The car also carries traditional automotive radars and LiDARs, but their resolution and their ability to detect non-metallic obstacles are limited, particularly when it comes to recognising shapes.
“Marilyn can also combine radar and LiDAR technologies by optimising the best aspects of the different sensors,” said Kutila. “This makes the automatic vehicle safer than a car driven by a person. Although there are still a lot of obstacles in the development path, a major leap has been taken in the right direction.”
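The fusion logic itself is not described in the article. As a rough illustration only, combining sensors could be as simple as a confidence-weighted estimate, where each sensor’s confidence is derated for the current visibility; the class and weights below are invented for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # e.g. "radar", "lidar_905nm", "lidar_1550nm"
    position_m: tuple  # (x, y) obstacle position in the vehicle frame
    confidence: float  # 0..1, already reduced for current visibility

def fuse(detections):
    """Confidence-weighted average of the obstacle position across sensors."""
    total = sum(d.confidence for d in detections)
    if total == 0:
        return None
    x = sum(d.position_m[0] * d.confidence for d in detections) / total
    y = sum(d.position_m[1] * d.confidence for d in detections) / total
    return (x, y)

# In fog, the 905 nm LiDAR's confidence is low, so the fused estimate
# leans on the radar and the 1550 nm LiDAR.
fused = fuse([
    Detection("radar", (28.0, 0.4), 0.6),
    Detection("lidar_905nm", (27.0, 0.1), 0.1),
    Detection("lidar_1550nm", (28.4, 0.3), 0.8),
])
```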
Marilyn was presented to the public at the RobustSENSE Final Event in Germany this week, where she displayed her capabilities alongside six other robot cars. Speaking prior to the event, Kutila revealed, “Marilyn will drive through a bank of fog created in a tent, which is so thick that the passengers cannot see through it. After passing through the chamber, the car will automatically avoid an obstacle in front, in this case a dummy.
“Among Marilyn’s sensors, the 905 nm LiDARs cannot see through mist — the new 1550 nm LiDAR is the only sensor based on which a decision to swerve can be made.”
Marilyn will next try out her skills in an automated parking exercise in the European summer, based on commands from outside the car. Her ‘husband’, VTT’s Martti, is also set to make the news during the summer and in November, when he makes route selections based on friction data and LiDAR.
“We still have a long way to go on the journey towards 24/7 automated driving, but we are now a big leap closer to achieving our dream,” said Kutila. “If we think of this as a 42 km marathon, we are now perhaps 10 km closer to our goal.”