Overnight, a new video was uploaded from the Conference on Computer Vision and Pattern Recognition (CVPR), where Tesla Autopilot Software Manager Ashok Elluswamy gave a presentation.
Elluswamy provided great insight into Tesla’s ongoing development of its computer-vision-based autonomous software, known as Full Self Driving (FSD). Although the name confuses non-Tesla owners, the detail on Tesla’s efforts to understand the world around the vehicle and navigate it safely should leave no doubt that the company absolutely plans to make its cars self-driving.
During the conference, we saw new strategies from Tesla to address the challenges of achieving this goal. Tesla started out with lane lines as the bounding parameter for driving, which is still used today in Autopilot, but FSD Beta has moved to using drivable space as the limiting factor, allowing the car to cross lines where appropriate and move through complex environments in a more human-like way.
Tesla also performs depth calculations from the vision system to determine whether objects are solid and should therefore be avoided, but this presents challenges of its own, such as reasoning about objects that are partially occluded.
Elluswamy presented Occupancy Networks as the solution to these challenges. This new technique takes inputs from all 8 cameras to build a 3D representation of the volumetric occupancy around the car. This means the car doesn’t just identify curbs to avoid when cornering, but actually understands its own extent and the available space around it (this could be extremely beneficial for the Semi, helping it avoid objects like hanging signs and even overhanging trees).
In effect, the world is reduced to occupied and unoccupied space, and the car cannot enter occupied space, regardless of what occupies it. Whether it’s a permanent object like a mailbox or a temporary one like a parked car, the car understands that it can’t enter that space and navigates around it, while still applying the constraints of the general rules of the road.
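To make the idea concrete, here is a minimal toy sketch (not Tesla’s code; the grid size, obstacle placement, and helper names are all my own illustrative assumptions) of how a planner can reject any path that passes through occupied space, without caring what the occupant is:

```python
import numpy as np

# Illustrative sketch: a coarse voxel grid around the car.
# True = occupied space the planner must never enter, regardless of what
# occupies it (mailbox, parked car, overhanging sign).
GRID = np.zeros((40, 40, 8), dtype=bool)   # 40 m x 40 m x 8 m at 1 m voxels
GRID[25, 18:22, 0:2] = True                # e.g. a parked car ahead

def path_is_free(path):
    """Reject any planned path that passes through an occupied voxel."""
    return not any(GRID[x, y, z] for (x, y, z) in path)

straight_ahead = [(x, 20, 0) for x in range(20, 30)]
swerve_right   = [(x, 24, 0) for x in range(20, 30)]
print(path_is_free(straight_ahead))  # False: blocked by the parked car
print(path_is_free(swerve_right))    # True: clear
```

The key design point is that the check never asks *what* the object is, only whether the voxels are occupied, which is exactly why unrecognized objects still get avoided.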
Things aren’t just static in this occupancy network, either. Tesla treats all objects as if they could move, and works to predict their flow (or movement) over time to understand whether they are about to occupy the space the Tesla is trying to move into.
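A hedged sketch of that idea: alongside occupancy, store a per-cell flow (velocity) and roll it forward in time to ask "will this space still be free when we get there?". The coordinates, units, and function name below are illustrative assumptions, not Tesla’s implementation:

```python
# Toy occupancy-flow prediction: every occupied cell carries a velocity,
# and we extrapolate it forward to see whether it will claim our target gap.
def predict_occupied(cell, flow, t):
    """Where an occupied cell ends up after t seconds at constant flow
    (metres and seconds assumed)."""
    x, y = cell
    vx, vy = flow
    return (round(x + vx * t), round(y + vy * t))

# A car currently 6 m from our target gap, closing at 2 m/s:
moving_car = (10, 6)
its_flow = (0.0, -2.0)
target_space = (10, 0)

arrives_in_3s = predict_occupied(moving_car, its_flow, 3.0)
print(arrives_in_3s == target_space)  # True: the gap won't stay free
```

Treating everything as potentially mobile means even a currently parked car contributes a (near-zero) flow estimate rather than being assumed frozen.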
What was great to hear and see in the demo was Tesla’s ability to detect objects it doesn’t have a 3D asset for and still understand that they take up space that isn’t available to the car. That means you might not see an item in the visualization, but that doesn’t mean the car didn’t see it and account for it. I expect there will be some interesting discussions within Tesla about how to represent this, given that the library of 3D assets in the software can’t be infinite.
One of the most interesting sections of the presentation comes around the 18-minute mark, when Elluswamy discusses occlusion. Parts of the environments we drive in can sometimes be hidden from the cameras’ view, and it’s a real challenge to understand how Tesla could overcome such a problem.
An example of this is the unprotected left known as ‘Chuck’s turn‘, which is expected to be addressed in FSD Beta v10.69, dropping today. In the presentation we see an example of an intersection where oncoming traffic is obscured from view.
Another great example of these difficult intersections is this image, showing trees to the left of the vehicle blocking the view.
To date, Tesla has used a fairly basic implementation of creep, which drives the car forward to widen its viewing angle so that the vision system and path planner have a chance of routing the car through an intersection.
What we learned today is that the car’s onboard computer can figure out where it has visibility problems (i.e. where cars could appear seemingly out of nowhere) and adapt to this lack of visibility. Think of it as smart creep: by understanding how far ahead of the car’s nose cross traffic passes, it can judge how much space is available to sneak into. With the right amount of creep, visibility of oncoming traffic improves and the car can navigate the environment safely.
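The geometry behind smart creep can be sketched with similar triangles: the further the camera moves past the stop line, the further it can see down the cross street past a fixed occluder. This is a toy model with made-up numbers, not Tesla’s algorithm; the occluder position, lane distance, and required sight distance are all assumptions:

```python
# Toy "smart creep": inch forward until we can see far enough down the
# cross street past an occluder (e.g. the trees in the presentation).
OCCLUDER = (-3.0, 4.0)   # occlusion corner: 3 m to the left, 4 m ahead
CROSS_LANE_Y = 10.0      # distance ahead to the oncoming lane centreline
REQUIRED_SIGHT = 50.0    # metres of visibility needed to proceed (assumed)

def sight_distance(creep):
    """Lateral distance visible down the cross street from (0, creep),
    with the sight line grazing the occluder corner (similar triangles)."""
    ox, oy = OCCLUDER
    if creep >= oy:
        return float("inf")          # camera is past the occluder
    return abs(ox) * (CROSS_LANE_Y - creep) / (oy - creep)

creep = 0.0
while sight_distance(creep) < REQUIRED_SIGHT:
    creep += 0.1                     # inch forward 10 cm at a time
print(round(creep, 1))               # → 3.7
```

The point of the sketch is that the required creep distance falls out of geometry rather than a fixed "creep N metres" rule, which matches the presentation’s framing of creeping just enough to resolve the occlusion.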
I think that’s what’s included in FSD Beta v10.69, making it quite a significant architectural change to how decisions are made in the car.
The unbreakable car
In the back half of the presentation, we learned about Tesla’s collision avoidance capability. We’ve seen a hint of this previously with Smart Shift: in the updated Model S and Model X, Smart Shift automatically selects the direction of travel based on what the cameras see.
This collision avoidance technology now goes well beyond that, and it sounds like it’s coming to all Teslas. It accounts for mistakes human drivers make by incorrectly applying the throttle, which can result in vehicles colliding with buildings, other vehicles, or, even worse, people.
Using the spatial occupancy data described above, Tesla translates the situation into a probability of collision. In other words, if you’re facing a wall and you press the accelerator, the probability of an accident would be almost 100%, so the car simply wouldn’t let you do it.
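A minimal sketch of that veto logic, assuming a simple kinematic model (the function names, threshold, and planning horizon are illustrative, not Tesla’s): project how far the requested acceleration would carry the car, compare that against the distance to the nearest occupied space, and block the input when a collision is near-certain.

```python
# Toy collision-probability gate on driver throttle input.
def collision_probability(distance_to_obstacle, speed, accel, horizon=2.0):
    """Crude probability proxy: how much of the gap to the obstacle the
    kinematic path (s = v*t + 0.5*a*t^2) consumes within the horizon."""
    travel = speed * horizon + 0.5 * accel * horizon ** 2
    if distance_to_obstacle <= 0:
        return 1.0
    return min(1.0, max(0.0, travel / distance_to_obstacle))

def allow_throttle(distance, speed, requested_accel, threshold=0.95):
    """Veto the pedal when a crash is near-certain."""
    return collision_probability(distance, speed, requested_accel) < threshold

# Facing a wall 3 m away and flooring it (say 4 m/s^2 from rest):
print(allow_throttle(3.0, 0.0, 4.0))   # False: the car refuses the input
print(allow_throttle(30.0, 0.0, 1.0))  # True: gentle pull-away, room to spare
```

The real system would reason over the full occupancy volume and many candidate trajectories, but the core idea (input → predicted path → occupancy check → probability → veto) is the same.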
The image above is from a simulation in which a distracted driver presses the accelerator and offers no steering input. The car senses that an accident would occur and plots a path through the environment safely. Unlike the modest lane correction we have in cars today, which turns the steering wheel just a few degrees, in this example the Tesla would drastically change course, crossing into the correct lane and continuing safely.
Elluswamy concluded the presentation by explaining that if Tesla can successfully implement all the techniques discussed, it could produce a car that never needs to crash.
This job is clearly not done and solving it will take more effort, with the final slide of the presentation serving as a pitch for engineers to come join Tesla in building a car that never crashes!
This will be a very interesting challenge for Tesla now that the company is in the insurance business. There’s always the possibility of other cars bumping into a Tesla, so insurance will always be needed; still, imagining a day when all Teslas with HW3 and above get a software update that makes them effectively uncrashable (regardless of driver input) is wild to think about.