Tesla Autopilot 8.0 Gains Radar-Centric Safety Upgrade

Tesla is in the process of upgrading its Autopilot system to better 'see' the surrounding world and distinguish potentially dangerous objects that need to be avoided from those that don't require braking. Several of the changes that will ultimately lead to a smarter Autopilot system have been introduced in version 8 of its software.

Version 8 contains dozens of small refinements, though the most significant upgrade is that of more advanced signal processing that taps the onboard radar to create a picture of the outside world. This represents a major change in strategy for Tesla and its Autopilot function.

Tesla Autopilot

"The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was only meant to be a supplementary sensor to the primary camera and image processing system. After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition," Tesla said in a blog post. "This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar."

Tesla goes on to explain that photons at radar wavelengths can easily penetrate fog, dust, rain, and snow, but anything metallic looks like a mirror. The problem is compounded when that metallic object is dish shaped, like the bottom of an aluminum soda can. To a radar, the amplified signal from the concave bottom of a discarded can may look like a big, dangerous object, when in reality it's no cause to slam on the brakes.

"Therefore, the big problem in using radar to stop the car is avoiding false alarms. Slamming on the brakes is critical if you are about to hit something large and solid, but not if you are merely about to run over a soda can. Having lots of unnecessary braking events would at best be very annoying and at worst cause injury," Tesla explains.
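The trade-off Tesla describes can be sketched in a few lines. This is a hypothetical illustration, not Tesla's actual logic: the function name, the dBsm threshold, and the tracking-confirmation flag are all assumptions made for the example. The point is that return strength alone can't drive the brakes, because a soda can may reflect as strongly as a truck.

```python
# Hypothetical sketch of the false-alarm problem Tesla describes: a braking
# decision can't rely on radar return strength alone, because a small concave
# reflector (a soda can) can return as strong a signal as a large solid object.
# All names and thresholds here are illustrative, not Tesla's actual logic.

def should_brake(return_strength_db, confirmed_by_tracking):
    """Brake only when a strong return is confirmed over multiple frames.

    return_strength_db: radar cross-section estimate in dBsm (illustrative).
    confirmed_by_tracking: True if the object persisted consistently across
        several radar snapshots, reducing the chance of a one-off false alarm.
    """
    STRONG_RETURN_DB = 10.0  # arbitrary illustrative threshold
    return return_strength_db >= STRONG_RETURN_DB and confirmed_by_tracking

# A soda can may produce one strong return but fail multi-frame confirmation:
print(should_brake(15.0, confirmed_by_tracking=False))  # False: likely false alarm
print(should_brake(15.0, confirmed_by_tracking=True))   # True: brake
```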

Tesla Radar

Version 8 of Tesla's Autopilot software unlocks access to six times as many radar objects with the same hardware, and with a lot more information per object. That leads to more detailed radar snapshots, which are assembled into 3D pictures every tenth of a second. By comparing that data with the car's own velocity, the onboard radar can distinguish stationary objects from moving ones.
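The stationary-versus-moving comparison above can be sketched as follows. This is a minimal illustration under stated assumptions, not Tesla's code: a radar reports each object's closing speed, and an object that closes at roughly the car's own speed must be standing still. The function and field names are invented for the example.

```python
# Minimal sketch (not Tesla's implementation) of separating stationary objects
# from moving ones by comparing radar returns with the car's own velocity.
# A radar measures each object's radial (closing) speed; a stationary object's
# closing speed roughly equals the ego vehicle's own speed.

def classify_objects(ego_speed_mps, detections, tolerance_mps=1.0):
    """Label each detection as 'stationary' or 'moving'.

    detections: list of (object_id, closing_speed_mps) pairs, where
        closing_speed_mps is how fast the object approaches the car.
    """
    labels = {}
    for object_id, closing_speed in detections:
        # A stationary object closes at (approximately) the car's own speed.
        if abs(closing_speed - ego_speed_mps) <= tolerance_mps:
            labels[object_id] = "stationary"
        else:
            labels[object_id] = "moving"
    return labels

# At 30 m/s, a bridge closes at ~30 m/s; a lead car doing 25 m/s closes at ~5 m/s:
print(classify_objects(30.0, [("bridge", 29.8), ("lead_car", 5.0)]))
```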

The other upgrade deals with overhead highway signs positioned on a rise in the road, or bridges where the road dips underneath. These situations can look like a collision course, because the navigation data and the height accuracy of GPS can't tell whether the car will pass under the object or hit it. By the time the system figures it out, it's too late to brake.

This is a bigger challenge for Tesla. The newest software update doesn't solve it, but it does take the first step toward a solution: having Autopilot note the position of road signs, bridges, and other stationary objects, and map that data to radar. In retrospect, the car will compare the different situations, decide when it would have needed to brake, and then upload that data to the Tesla database. It's a learning process for Autopilot, and it also lets Tesla build a geocoded whitelist of locations.
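A geocoded whitelist of this kind could look like the sketch below. Everything here is an assumption made for illustration: the class and method names, and the coordinate-rounding scheme used to group nearby readings into one cell. The idea from the article is just that locations where the fleet has safely passed under a radar-flagged object are remembered, so future alerts there don't trigger braking.

```python
# Illustrative sketch of a geocoded whitelist like the one the article
# describes: locations where radar flagged an overhead object but fleet data
# showed no braking was needed. Names and the rounding scheme are assumptions.

def make_key(lat, lon, precision=4):
    """Quantize coordinates so nearby readings map to the same cell (~11 m)."""
    return (round(lat, precision), round(lon, precision))

class OverheadWhitelist:
    def __init__(self):
        self._safe_cells = set()

    def record_safe_pass(self, lat, lon):
        # Fleet data showed cars pass under the object here without incident.
        self._safe_cells.add(make_key(lat, lon))

    def braking_suppressed(self, lat, lon):
        """True if a radar alert at this location is a known false positive."""
        return make_key(lat, lon) in self._safe_cells

wl = OverheadWhitelist()
wl.record_safe_pass(37.39471, -122.15003)           # overhead sign on a rise
print(wl.braking_suppressed(37.39469, -122.15001))  # nearby reading -> True
print(wl.braking_suppressed(40.0, -100.0))          # unknown location -> False
```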

Over time, Autopilot will become smarter about situations it faces to the point where "the car should almost always hit the brakes correctly even if a UFO were to land on the freeway in zero visibility conditions."