This article first appeared in Clean Fleet Report.
As a CTO working in this area of innovation, I field a lot of requests from people in both the tech investment and ADAS industry communities wishing to better understand biomimetic radar. In doing so, I realized an article might be just the thing to provide a deeper understanding of a new approach to radar in automated vehicles—with tremendous potential to bridge the perception gap for all levels of ADAS.
What does “biomimetic” mean?
Biomimetic engineering is the emulation of the models, systems, and elements of nature for the purpose of solving complex problems. Velcro® is probably the best known example of biomimetic engineering. Swiss inventor George de Mestral took a burr (the seed of a burdock plant) from his dog’s coat and observed it under a microscope: each burr was covered with barbed hooks that made it the perfect hook and loop fastener. Marveling at nature’s design, de Mestral eventually replicated it and a multi-billion dollar product was born.
How can radar be biomimetic?
To harness the power of biomimetic engineering, it’s important to think beyond organic materials to the design solutions they exemplify. Inventor de Mestral didn’t manufacture Velcro out of burdock seeds; he simply emulated their design principles, “beta-tested” through millions of years of evolution.
The organic design principle underlying radar is echolocation. Bats and marine mammals send out sound waves and use the echoes that bounce back to determine the range, size, shape, motion and velocity of objects in their path.
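The underlying calculation is strikingly simple. As a rough illustration (my own sketch, not any production radar code), range follows directly from the round-trip time of the echo and the speed of the wave:

```python
# Toy illustration of the echolocation principle: range from echo delay.
# Assumed constants (for illustration only): speed of sound in air ~343 m/s,
# speed of light ~3e8 m/s for radar.

def echo_range(round_trip_s: float, wave_speed_m_s: float) -> float:
    """Distance to the reflecting object; the signal travels out and back,
    so the one-way distance is half the total path."""
    return wave_speed_m_s * round_trip_s / 2.0

# A bat hearing an echo 10 ms after its chirp:
bat_range = echo_range(0.010, 343.0)    # ~1.7 m
# A radar receiving an echo 1 microsecond after transmission:
radar_range = echo_range(1e-6, 3.0e8)   # 150 m
```

The same halved time-of-flight formula serves both the bat and the radar; only the wave speed changes.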
The role of radar in today’s ADAS is primarily basic object detection and avoidance. Think cruise control and lane change detection. But consider the virtuosic movement of a bat in flight, avoiding obstacles and predators at speeds over 80 mph. In total darkness. Current ADAS is just scratching the surface of echolocation’s potential – if ADAS sensor technology can be properly optimized by applying biomimetic models.
The answer to this question is, ultimately, a solution that works. Velcro wasn’t really Velcro until it worked. Inventor de Mestral experimented with cotton and other materials before finally arriving at the nylon and polyester manufacturing solution that successfully mimicked that evolutionarily designed seed.
In my experience, the daily work of innovation is really just informed trial and error. You have to learn the history of the problem you’re trying to solve so that, at the very least, you’re “failing forward,” adding new insights to the evolving body of knowledge.
What’s the history of biomimetic engineering as it relates to radar-assisted ADAS?
First, I think it’s important to acknowledge that all of ADAS is fundamentally a biomimetic engineering challenge: we’re building machines that drive – a complex combination of behaviors that only humans have been able to master. It’s easy to get caught up in dueling technology platforms and forget that what we’re talking about here is a fundamentally human activity that we are all trying to emulate.
The history of radar-assisted ADAS in production automobiles begins in 1999, when Mercedes introduced Distronic on its S-Class sedans, the first adaptive cruise control (ADAS Level 1) with radar-assisted automatic braking to be offered on a production automobile. More than 20 years later, the radar systems installed in most of today’s production vehicles are fundamentally unchanged: “basic” radar that scans according to instructions pre-set before the vehicle leaves the factory.
Radar systems that use AI to dynamically and adaptively scan the environment based on changing conditions have been science-fact, not science-fiction, for more than a decade – at least in the aerospace and defense industries where the technology was pioneered. There are now several companies racing to adapt this “cognitive radar” to the needs of the automotive industry.
All working “cognitive” radars represent a significant leap over current mass-produced ADAS radar tech and have contributed greatly to the knowledge-base of how we model human perception to support safe, autonomous driving – but perhaps not in the way their engineers intended.
What makes cognitive radar “cognitive” is the machine learning model of the AI that drives it. Cognitive AIs are optimized for large-scale computational processes like monitoring stock market fluctuations or facial pattern recognition. This kind of AI is like a Cuisinart – dicing huge quantities of data into something uniform and consistent.
Cuisinarts are great – but not for performing surgery. And the AI challenge of autonomous driving is much more surgical: slicing through all the cognitive noise of the driving environment to see only what’s necessary for safe navigation. Biomimetic radar AI emulates human perception models to achieve this lightning-fast context and focus.
This question brings forward an important insight about how NOT to do biomimetic engineering: make sure the natural system you’re emulating is the right one to solve your engineering challenge. Velcro inventor de Mestral had it easy. Burrs have only one remarkable design feature: they are really good at sticking to looped materials.
Humans are much more complicated. The “simple” act of driving requires the seamless interaction of multiple systems: vision, cognition, proprioception (position and movement of the body), and the central nervous system. On which system(s) should our biomimetic engineering focus?
We tend to identify with our cognitive abilities because, in the zoological scheme of things, we’re remarkably good at them. “I think, therefore I am.” Unfortunately, it’s our cognitive tendency to have our mind someplace besides here and now that can make us remarkably bad at driving. According to the National Highway Traffic Safety Administration (NHTSA), in terms of slowed reaction time, distracted driving is up to six times more dangerous than drunk driving.
So far, we’re seeing an unfortunate tendency for cognitive radars to behave like the distracted drivers they were intended to replace – witness the NHTSA’s recent actions to curb ADAS-related accidents. But it’s not just this failure to perform reliably that’s concerning – it’s also the cognitive model that’s driving it.
Sound cognitive assessments require a systematic analysis of all relevant data. It makes sense that cognitive-modeled ADAS systems would require a complex perception stack to integrate the mountain of data generated by high resolution cameras, LIDAR, and radar. The problem with sifting all that data? It takes time.
The one thing in critically short supply when your life flashes before your eyes at freeway speeds.
How is biomimetic radar really different from cognitive radar?
I think the best way to express this is by relating a “biomimetic moment” I had the other day. I was driving down the freeway, on a hands-free call. The traffic slowed. I changed lanes and tapped the brakes. Suddenly, a car swerved into my lane—changing everything.
Like many of you, what happened after the car in front of me swerved was not a conscious effort. In fact, I found that I couldn’t even recall what we had been talking about or what the person had said during that time. My sympathetic nervous system took the proverbial wheel and my cognitive mind took a backseat (or was unceremoniously shoved in the trunk). The next thing I knew, I was out of harm’s way and frankly stunned at the speed at which my survival skills analyzed the situation, determined the threats and the safest course of action, and executed—all in an instant.
These kinds of fight or flight responses—honed by millions of years of evolution to do one thing: keep us alive while we navigate the world—work really, really well. In retrospect, it can seem like a “duh” to model autonomous driving more on our beautifully efficient autonomic nervous system than on our terrifically complex cognitive brain. But insights like this are usually only obvious when viewed in the rearview mirror.
Biomimetic radar optimizes for these insights, using machine learning and edge processing to focus only on those critical environmental details essential for safe navigation: position, proximity, direction, and speed. For those still wrestling with how radar tech can be biomimetic, look back at my “biomimetic moment” or, better yet, recall one of your own – when making an evasive maneuver, are you taking in the scenery along the side of the road or observing visual details like the way sunlight reflects off the car in front of you as you speed toward a potential impact? Of course not. But a camera- or LIDAR-based ADAS system is. Biomimetic radar, like human perception, has the ability to focus only on what’s important.
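To make the “focus only on what’s important” idea concrete, here is a hypothetical sketch (my own, not an actual biomimetic radar implementation): rank detections by time-to-collision, a standard safety metric computed as range divided by closing speed, and attend only to the most urgent. All names and numbers below are invented for illustration.

```python
# Hypothetical sketch of perceptual triage: rank radar detections by
# time-to-collision (range / closing speed) and keep only the most urgent.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    range_m: float             # distance to the object, meters
    closing_speed_m_s: float   # positive = object approaching

def time_to_collision(d: Detection) -> float:
    # An object holding distance or moving away can't collide: infinite TTC.
    if d.closing_speed_m_s <= 0:
        return float("inf")
    return d.range_m / d.closing_speed_m_s

def most_urgent(detections, k=2):
    """Return the k detections with the smallest time-to-collision."""
    return sorted(detections, key=time_to_collision)[:k]

scene = [
    Detection("roadside sign", 40.0, -1.0),  # receding: safely ignored
    Detection("swerving car", 20.0, 10.0),   # TTC = 2 s: urgent
    Detection("slowing truck", 60.0, 5.0),   # TTC = 12 s
]
urgent = most_urgent(scene)  # the swerving car ranks first
```

The roadside sign, like the scenery in my “biomimetic moment,” never reaches the top of the attention queue; compute is spent where the threat is.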
There’s so much “vaporware” in the ADAS space – how is this different?
This is probably the question I get most often. As mentioned above, dynamically adaptive radar tech has been science fact in aerospace and defense for many years. While the application in the ADAS space is new, the basic tech is all solid state, thoroughly proven and optimized for the rigorous demands of aerospace and defense applications. And much of what’s game-changing here is actually on the software side – biomimetic algorithm enhancement can make anybody’s sensor tech work smarter and all perception stacks run smoother. This means there’s tremendous potential for all existing players in the ADAS space to uplevel their performance with the improvements we’re seeing in biomimetic radar.