The interior of a Tesla Model S is shown in autopilot mode in San Francisco, California, U.S., April 7, 2016.
Tesla announced Tuesday that it is ditching radar sensors from its driver-assistance features, including Autopilot.
In a blog post entitled “Transitioning to Tesla Vision,” the company said its best-selling Model 3 and Model Y vehicles made for customers in the U.S. and Canada starting this month would instead use a camera-based system to enable Autopilot features such as traffic-aware cruise control and automatic lane-keeping.
Radar sensors are relatively expensive, and processing data from them takes significant computing power in a vehicle. Tesla has previously told shareholders that it believes “a vision-only system is ultimately all that is needed for full autonomy,” and that it was planning to switch the U.S. market to Tesla Vision. CEO Elon Musk also said the company would move to a “pure vision” approach in a tweet earlier this year.
Tesla said that these will be the first Tesla vehicles to rely on camera vision and neural net processing to deliver “Autopilot, Full Self-Driving and certain active safety features.”
The company also cautioned that Autopilot and FSD systems would be temporarily limited during this period of technical adjustment.
“For a short period during this transition, cars with Tesla Vision may be delivered with some features temporarily limited or inactive, including: Autosteer will be limited to a maximum speed of 75 mph and a longer minimum following distance. Smart Summon (if equipped) and Emergency Lane Departure Avoidance may be disabled at delivery.”
Customers who ordered a Model 3 or Model Y before the change was announced will be informed of the modification before accepting delivery of their cars.
All new Tesla vehicles include a standard set of advanced driver-assistance system (ADAS) features dubbed Autopilot.
Tesla also sells a $10,000 premium software package marketed as “Full Self-Driving,” or FSD. Tesla gives select drivers early access to a beta version of FSD — effectively turning thousands of customers into software testers on public roads in the U.S.
According to the company’s website, Autopilot currently enables a Tesla vehicle to “steer, accelerate and brake automatically within its lane,” and FSD adds features like automatic lane changes and Summon, the latter of which lets a driver use the Tesla app like a remote control to call the car to come pick them up across a parking lot.
Tesla cautions in its owners’ manual and on its website that Autopilot and FSD require active supervision. But some drivers incorrectly believe that a Tesla is safe to operate hands-free, while asleep at the wheel, or even from the back seat.
One Tesla owner who posted social media videos of himself using Autopilot without his hands on the wheel was killed in a collision in Southern California earlier this month. Another was arrested by the California Highway Patrol for taking his Tesla on unsafe joy rides during which he sat in the back seat and let the car operate on public highways with no driver at the wheel.
Most others using radar and lidar
Other automakers are taking a different tack when it comes to the development, rollout and marketing of automated driving systems. GM Cruise, Alphabet’s Waymo, Aurora and others are including radar and lidar alongside cameras in their systems.
While cameras capture video that can be labeled by human data analysts and interpreted by machine learning software, radar and lidar sensors provide additional data that gives cars a more robust way to detect and avoid obstacles on the road — especially when visibility is low, such as at night or in inclement weather.
Musk has called lidar a “crutch” and a “fool’s errand,” saying it’s too expensive and hard to use. But he has not dismissed radar entirely yet.
Tesla intends to keep radar in its higher-cost Model S and Model X vehicles, and Model 3 and Model Y cars made in China or for shipment to markets beyond North America.
According to Phil Koopman, CTO of Edge Case Research and professor of electrical and computer engineering at Carnegie Mellon University, Tesla should be able to offer some features with vision alone today, but might need to reintroduce radar later to deliver more advanced automated features.
“The sensors used by an SAE Level 2 [system] (human driver responsible for monitoring safety at all times) [are] at manufacturer discretion. So it’s possible they can provide at least some features with camera only, noting that the human is responsible for handling anything the camera can’t,” said Koopman.
“Tesla’s features are currently limited to this SAE Level 2. If in the future Tesla wants to achieve SAE Level 4 (automated vehicle with no human driver safety supervision — which is not the current capability), then it would [be] prudent to use every type of sensor they can get, including cameras, radar, lidar, and possibly others.”