The U.S. government's highway safety agency says Tesla is telling drivers in public statements that its vehicles can drive themselves, a message that conflicts with the company's owner's manuals and its briefings to the agency, which say the electric vehicles need human supervision.
The National Highway Traffic Safety Administration is asking the company to “revisit its communications” to make sure messages are consistent with user instructions.
The request came in a May email to the company from Gregory Magno, a division chief with the agency's Office of Defects Investigation. It was attached to a letter seeking information on a probe into crashes involving Tesla's “Full Self-Driving” system in low-visibility conditions. The letter was posted Friday on the agency's website.
The agency began the investigation in October after getting reports of four crashes involving “Full Self-Driving” when Teslas encountered sun glare, fog and airborne dust. An Arizona pedestrian was killed in one of the crashes.
Critics, including Transportation Secretary Pete Buttigieg, have long accused Tesla of using deceptive names for its partially automated driving systems, including “Full Self-Driving” and “Autopilot,” names that have led some owners to treat the vehicles as fully autonomous.
The letter and email raise further questions about whether Full Self-Driving will be ready for use without human drivers on public roads, as Tesla CEO Elon Musk has predicted. Much of Tesla's stock valuation hinges on the company deploying a fleet of autonomous robotaxis.
Musk, who has promised autonomous vehicles before, said the company plans to have autonomous Models Y and 3 running without human drivers next year. Robotaxis without steering wheels would be available in 2026 starting in California and Texas, he said.
A message was sent Friday seeking comment from Tesla.
In the email, Magno writes that Tesla briefed the agency in April on an offer of a free trial of “Full Self-Driving” and emphasized that the owner's manual, user interface and a YouTube video tell human drivers that they must remain vigilant and in full control of their vehicles.
But Magno cited seven posts or reposts by Tesla's account on X, the social media platform owned by Musk, that Magno said indicated that Full Self-Driving is capable of driving itself.
“Tesla's X account has reposted or endorsed postings that exhibit disengaged driver behavior,” Magno wrote. “We believe that Tesla's postings conflict with its stated messaging that the driver is to maintain continued control over the dynamic driving task.”
The postings may encourage drivers to view Full Self-Driving, which now has the word “supervised” next to it in Tesla materials, as a “chauffeur or robotaxi rather than a partial automation/driver assist system that requires persistent attention and intermittent intervention by the driver,” Magno wrote.
On April 11, for instance, Tesla reposted a story about a man who used Full Self-Driving to travel 13 miles (21 kilometers) from his home to an emergency room during a heart attack just after the free trial began on April 1. A version of Full Self-Driving helped the owner “get to the hospital when he needed immediate medical attention,” the post said.
In addition, Tesla says on its website that use of Full Self-Driving and Autopilot without human supervision depends on “achieving reliability” and regulatory approval, Magno wrote. But that statement is accompanied by a video of a man driving on local roads with his hands on his knees, along with the claim that, “The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself,” the email said.
In the letter seeking information on driving in low-visibility conditions, Magno wrote that the investigation will focus on the system's ability to perform in low-visibility conditions caused by “relatively common traffic occurrences.”
Drivers, he wrote, may not be told by the car that it is up to them to decide where Full Self-Driving can safely operate, and may not fully understand the system's capabilities.
“This investigation will consider the adequacy of feedback or information the system provides to drivers to enable them to make a decision in real time when the capability of the system has been exceeded,” Magno wrote.
The letter asks Tesla to describe all visual or audio warnings that drivers get that the system “is unable to detect and respond to any reduced visibility condition.”
The agency gave Tesla until Dec. 18 to respond to the letter, but the company can ask for an extension.
That means the investigation is unlikely to be finished by the time President-elect Donald Trump takes office in January. Trump has said he would put Musk in charge of a government efficiency commission to audit agencies and eliminate fraud. Musk spent at least $119 million on a campaign to get Trump elected, and Trump has spoken out against government regulations.
Auto safety advocates fear that if Musk gains some control over NHTSA, the Full Self-Driving and other investigations into Tesla could be derailed.
Musk has even floated the idea of helping to develop national safety standards for self-driving vehicles.
“Of course the fox wants to build the henhouse,” said Michael Brooks, executive director of the Center for Auto Safety, a nonprofit watchdog group.
He added that he can't think of anyone who would agree that a business mogul should have direct involvement in regulations that affect the mogul’s companies.
“That’s a huge problem for democracy, really,” Brooks said.