Hyundai Invests in Metawave’s Intelligent Radar for Self-Driving Vehicles
Metawave Corp. is developing a superior radar system with 3D vision that gives self-driving cars a human-like interpretation of the world, and Hyundai Motor Co. is investing in the company’s intelligent radar platform, called Warlord.
The system can interpret the surrounding environment and detect and classify objects more efficiently than current perception-system technology.
“Metawave is the next generation of radar that has a long-range capability, and also is able to image objects and classify objects at these long distances,” says Metawave CEO Maha Achour. “The goal is to give the car the range, the speed, the brains and the capability to operate in all weather conditions and operating environments. Without the radar, it’s impossible for a car to drive at full speed on a freeway while at the same time operating all by itself.”
Overcoming the limits of traditional radar
Warlord’s smart radar avoids the shortcomings of conventional radar, LiDAR and cameras, the basic components of today’s driverless cars’ perception systems. Conventional radar can operate in poor weather, is fast and detects objects at long range. The downside is that it loses wide-angle coverage at long distances and doesn’t have the resolution to differentiate between objects. It also requires a large number of antennas and costly chips to run complex digital-signal processing.
LiDAR emits pulsed laser light up to about 160 yards to collect high-resolution 3D data of the surroundings, but inclement weather and dirty road conditions affect its performance. Cameras provide high-resolution images, but also are susceptible to bad weather and dusty roads, and have a range limitation of slightly more than 50 yards.
Warlord radar can detect objects beyond 300 yards, and can operate in poor weather and on dirty roads, says Achour. Unlike LiDAR and cameras, which rely on optics, it can still image objects when there is residue on its aperture.
Metawave’s radar is a stand-alone system that doesn’t require feedback from other sensors. It works in harmony with those sensors but serves as the primary one: the key component in a “sensor fusion” technology package that also includes conventional radar, cameras, LiDAR, GPS and digital maps.
“We always think about the radar being the long-range sensor, and then the LiDAR and the camera pick up that task as the car gets closer to the object,” says Achour. Using the other sensors to zoom in on the specific regions the radar identifies shortens processing time, reduces power consumption and speeds sensor-fusion decision-making.
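As a rough illustration of that division of labor, here is a minimal Python sketch of range-gated sensor selection, assuming the approximate sensor ranges cited earlier; the RadarDetection type and sensors_for function are hypothetical stand-ins, not Metawave’s actual software.

```python
from dataclasses import dataclass

# Approximate sensor ranges from the article, in yards.
RADAR_RANGE = 300
LIDAR_RANGE = 160
CAMERA_RANGE = 50

@dataclass
class RadarDetection:
    azimuth_deg: float   # bearing of the object
    range_yd: float      # distance reported by the radar
    speed_mph: float     # radial speed from Doppler

def sensors_for(detection: RadarDetection) -> list[str]:
    """Pick which sensors should examine the region the radar flagged.

    The long-range radar finds the object first; LiDAR and the camera
    "zoom in" only once the object is inside their shorter ranges,
    limiting processing to regions the radar has already vetted.
    """
    sensors = ["radar"]  # radar tracks everything it can see
    if detection.range_yd <= LIDAR_RANGE:
        sensors.append("lidar")
    if detection.range_yd <= CAMERA_RANGE:
        sensors.append("camera")
    return sensors

print(sensors_for(RadarDetection(azimuth_deg=2.0, range_yd=220, speed_mph=60)))
# ['radar'] -- too far for LiDAR or the camera; radar alone tracks it
print(sensors_for(RadarDetection(azimuth_deg=-5.0, range_yd=40, speed_mph=10)))
# ['radar', 'lidar', 'camera'] -- all three sensors examine the region
```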
A key element in the efficiency of Metawave’s radar platform is its use of a single antenna, which allows ultra-fast, precise operation. Conventional radar, by contrast, requires multiple transmitter and receiver antennas that process the signal with digital beamforming. The drawbacks of digital beamforming are insufficient resolution and the limitation of steering a beam, or scanning, only in the horizontal direction. There is no elevation information to distinguish between objects such as a bus and a bridge; in the horizontal plane they look the same.
“In our case we use a single antenna, but this is an active analog antenna,” says Achour. “The antenna itself, independently from the baseband, is basically able to shape the beam and make a very narrow (‘pencil’) beam or a wider beam, and also steer it in both the horizontal and vertical directions. This allows scalability of the radar to address different operating conditions: in an area where you need the much wider beam to look at all of the objects, or in other situations where you need the much narrower beam.”
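What beam shaping and steering involve can be sketched with a generic phased-array model, assuming a uniform planar array at half-wavelength element spacing; the element counts, angles and the steering_phases helper below are illustrative inventions, not Metawave’s design. The beamwidth estimate uses the standard uniform-aperture approximation (roughly 0.886 wavelengths divided by aperture size).

```python
import numpy as np

def steering_phases(n_x, n_y, az_deg, el_deg, spacing_wl=0.5):
    """Per-element phase shifts (radians) that point the beam at the
    given azimuth and elevation. In an analog architecture these phases
    are applied in hardware ahead of a single transmit/receive chain."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    # Direction cosines of the desired beam relative to the array plane.
    u = np.sin(az) * np.cos(el)   # horizontal component
    v = np.sin(el)                # vertical component
    ix, iy = np.meshgrid(np.arange(n_x), np.arange(n_y))
    return -2 * np.pi * spacing_wl * (ix * u + iy * v)

# A larger aperture yields a narrower "pencil" beam: half-power
# beamwidth of a uniform array is about 0.886 * wavelength / aperture.
for n in (4, 16, 64):
    beamwidth_deg = np.degrees(0.886 / (n * 0.5))
    print(f"{n:>2} elements per side -> ~{beamwidth_deg:.1f} deg beamwidth")

phases = steering_phases(16, 16, az_deg=10, el_deg=-3)
print(phases.shape)  # (16, 16): one phase shift per element
```

The loop shows the scalability Achour describes: the same array can trade a wide search beam for a narrow pencil beam depending on how much of the aperture participates.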
How advanced radar for cars will work
Along with varying a beam’s shape, the Metawave system can “see around corners.” The radar performs non-line-of-sight detection by analyzing secondary reflections: signals that bounce off buildings or road dividers and then hit an object before returning. When the radar sees a moving object around the corner, the vehicle can slow down.
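Metawave hasn’t disclosed how its secondary-reflection processing works, but the underlying geometry can be shown with a toy Python example: a flat reflector acts like a mirror, so an echo that appears to come from behind a wall can be unfolded back to a real position around the corner. The wall position and coordinates here are invented for illustration.

```python
# Toy non-line-of-sight geometry: a building wall along the plane x = 0
# acts as a mirror for the radar beam. All values are invented.

WALL_X = 0.0  # x-coordinate of the reflecting wall

def unfold_reflection(apparent_xy):
    """A single bounce off the wall makes the echo appear to come from
    the mirror image of the real object; reflecting the apparent point
    back across the wall recovers the true position."""
    x, y = apparent_xy
    return (2 * WALL_X - x, y)

# The radar measures an echo that seems to originate behind the wall...
apparent = (-12.0, 30.0)   # yards, in the radar's coordinate frame
print(unfold_reflection(apparent))  # (12.0, 30.0): object around the corner
```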
Warlord radar steers a highly directive beam that works in cluttered environments to precisely determine the location and speed of objects. It possesses superior speed and vision, as well as intelligence through its artificial-intelligence (AI) engine, which Achour says is trained to identify objects.
“When we train the engine, we create what we call micro-Doppler signatures, so each object now has a signature,” she says. “When we train the AI engine, it stores these signatures, so the next time the radar comes across the same objects, it compares the reflected signals against the signatures stored in its database. If it finds one that is identical or close enough, it will say with 90 or 95 percent confidence, this is a car, this is a bus. If it doesn’t know, it transfers the object over and gives it to the radar to be able to train itself for the next time.”
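In code, that train-compare-fallback loop might look like the following Python sketch, assuming a micro-Doppler signature can be reduced to a small feature vector; the feature values, labels and matching threshold are invented, and simple nearest-signature matching stands in for whatever model Metawave actually trains.

```python
import numpy as np

# Hypothetical signature database built during training; each object
# class maps to a summary micro-Doppler feature vector (values invented).
SIGNATURES = {
    "car":        np.array([0.90, 0.20, 0.10]),
    "bus":        np.array([0.70, 0.60, 0.30]),
    "pedestrian": np.array([0.10, 0.80, 0.90]),
}
MATCH_THRESHOLD = 0.15  # "close enough" stand-in for the confidence check

def classify(signature):
    """Compare a new echo's signature to the stored ones; report the
    nearest class when the match is close enough, otherwise queue the
    sample so the engine can train itself for the next time."""
    label, dist = min(
        ((name, float(np.linalg.norm(signature - ref)))
         for name, ref in SIGNATURES.items()),
        key=lambda pair: pair[1],
    )
    if dist <= MATCH_THRESHOLD:
        return label, dist
    return "unknown (queued for retraining)", dist

print(classify(np.array([0.85, 0.25, 0.12])))  # close match -> ('car', ...)
print(classify(np.array([0.40, 0.40, 0.40])))  # ambiguous -> unknown
```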
While Hyundai isn’t commenting on Warlord’s role in future products, the automaker has announced ambitious plans for its autonomous-vehicle program. “Hyundai Motor Group, which includes flagship units Hyundai Motor and Kia Motors, is preparing for the commercialization of an SAE-standard Level 4-compliant autonomous-driving system in smart cities by 2021,” says Miles Johnson, senior manager of quality, service and technology public relations.
Testing of autonomous Hyundai vehicles is in advanced stages. Earlier this year, a fleet of Level 4 Hyundai NEXO fuel-cell electric SUVs successfully completed a 118-mile self-driven trek through South Korea.
Achour believes autonomous technology is on a steady path that will take self-driving cars beyond fleet use such as ride-hailing services, where the initial focus has been, and onto public roads. When the technology is mature, she speculates, it may happen first in specific geographic locations, perhaps in China or Japan, where roads are more autonomous-vehicle-friendly.
“The biggest unknown on the road is human drivers,” Achour says, “and once you have the human-driver factor off the road, then it’s going to be much more effective. In the meantime, the learning engine and the car itself should be able to identify some direct behavior from the human driver to take care of these situations.”