Tesla’s autonomous driving technology on the road

SAN FRANCISCO – Kevin Smith has a love-hate relationship with driving. He was hit twice in a short span, his daughter crashed his car weeks after getting her driver’s license, and his mother opted to give up hers after she began missing red lights.

“I felt like I needed better driver assistance or I was going to have a panic attack,” he said.

Smith is now among at least 12,000 beta testers of Tesla’s polarizing “Full Self-Driving” software, which can handle many everyday driving tasks, albeit at times in unpredictable ways. Despite its flaws, Smith thinks it makes him safer. He is willing to take on the job even though he knows he may have to intervene when the software makes mistakes: running a red light, driving onto tram tracks or nearly hitting a person in a crosswalk, all scenarios that beta testers interviewed by The Washington Post have encountered on the road.

“It de-stresses me,” he said in an interview. “I watch more. I’m more aware of everything around me. I feel safer with it.”

At the heart of Tesla’s strategy is a bold bet that its thousands of chosen testers, many of whom passed a safety screening that monitored their driving for a week or more, will gather enough real-world data to rapidly improve the software on the fly. By navigating public roads with unproven software, Tesla’s Full Self-Driving beta testers have not only volunteered to help but have taken on responsibility for any mistakes the software might make.

The Post surveyed half a dozen beta testers, who paid up to $10,000 to upgrade their cars with the software. Self-proclaimed Tesla fans, the testers were all in awe of what the software can do, but well aware of its limitations and the risks involved. Some found the software too inconsistent and stressful to use, and criticized Tesla for releasing it too early.

“At first when I heard this was going to be released to the public, I was like, ‘Oh, not good,’” said a former Tesla Autopilot engineer, who had early access to the Full Self-Driving beta and spoke on the condition of anonymity for fear of retaliation from the company. He remembers thinking, “This is not ready to be put into public hands.”

Tesla did not respond to a request for comment. The company has defended its software in the past: it has said, for example, that its Autopilot driver-assistance software is safer than normal driving, citing comparisons of crash data. Tesla CEO Elon Musk has called Autopilot “unequivocally safer.” That data is not directly comparable, however, because Autopilot is limited to certain types of roads and conditions.

The race to develop fully autonomous vehicles, one of the most ambitious technological undertakings since the moon landing, is now a multibillion-dollar industry. The players include big automakers, Silicon Valley giants such as Apple and Google, and a host of deep-pocketed startups, all pouring money and people into the effort.

Each company has its own approach to testing the technology, though most pay “safety drivers” to methodically test the software in the real world. Many also pay hundreds or thousands of “labelers” to painstakingly mark up “high definition” maps so the cars know the roads, speed limits and traffic signage, supplementing the arrays of expensive sensors placed around the vehicles.

Tesla’s approach stands out as radically different. The software, which Musk called a “questionable” upgrade in July, is pushed out over the air to cars already on the road and in the hands of ordinary drivers.

Safety experts and autonomous-driving companies say the move is reckless and shortsighted. The National Highway Traffic Safety Administration recently began requiring companies deploying autonomous vehicles and advanced driver-assistance systems to report many crashes within a day of learning of the incidents, fueling fears that Tesla’s boldness could invite stricter regulations, slowing progress across the industry.

Although the name suggests otherwise, Tesla’s Full Self-Driving software is not autonomous, and drivers must pay attention to the road at all times. Full Self-Driving is an evolution of the earlier Autopilot software suite, which could pilot vehicles from a highway on-ramp to an off-ramp, making lane changes and steering within marked lane lines. Full Self-Driving extends those capabilities to residential and urban streets.

For Tesla’s guinea pigs, the promise of full self-driving, however risky and unproven, offers an immediate antidote to the monotony of traffic and the hope of safer roads, on which more than 20,000 Americans died in the first half of 2021 alone, an 18 percent increase. Many Tesla owners dismiss safety concerns about the software in part because they don’t believe it could make things any worse.

Christopher Hart, former chairman of the National Transportation Safety Board and now a safety consultant, said driver-assistance systems such as Tesla’s should be tested more thoroughly before they hit the streets.

“It makes an already skeptical public even more skeptical,” he said of the effort. Hart, who advised Uber on its safety culture after a fatal crash involving one of its autonomous vehicles, said he wants the technology to succeed and believes it could ultimately save lives if rolled out in a careful, conscientious way.

Tesla is making a big bet that the technology will improve faster than the liabilities of testing it on public roads pile up, said Andrew Maynard, a professor at Arizona State University and director of its Risk Innovation Lab.

“It’s a gamble that could pay off: if there are few serious incidents involving drivers, passengers, other road users [et cetera], consumer opinion continues to support the company and Tesla stays ahead of regulators, I can see a point where the safety and usefulness of FSD far outweigh the concerns,” he added.

But drivers say their experience shows that day is far off. Some were startled one October day when Tesla vehicles began behaving erratically after an overnight software update. The cars braked sharply at highway speeds, which Tesla attributed to false triggers of the forward-collision warning and automatic emergency braking systems caused by the update.

The company subsequently issued a recall, and owners, including Smith, said they were appalled by its handling of the episode.

Others, meanwhile, have soured on the software.

Marc Hoag, a self-proclaimed Tesla fanboy and a shareholder in the company, waited a year and a half to get the software. But once he tried it, he was disappointed.

“It’s still so incredibly bad,” he said.

Hoag said the driving experience is worse in person than it appears in the videos he has posted to YouTube, which show the car taking turns too wide, accelerating around bends and mistaking a pedestrian-crossing sign for a pedestrian, all while acting hesitant at intersections alongside other vehicles. Its jerky steering and indecisive braking make driving unpleasant, and its unpredictability makes it scary, he said.

“It seems to me remarkably, shockingly premature to allow it to be tested on public roads,” he added. “You can’t test it and not be reckless at the same time.”

In 2019, Musk boldly promised that the company’s cars would gain the ability to drive themselves, turning Teslas into a fleet of 1 million “robotaxis” by 2020. It is now clear that robotaxis were not on the horizon, and a California DMV memo reported by Bloomberg, and posted on the legal transparency site PlainSite, suggested Tesla knew it was overpromising.

Many loyal Tesla drivers don’t blame Musk for promising the impossible.

“It’s about understanding the game Elon is playing with all of us. Most of us have figured that out and are willing to help,” said Nicholas Puschak, a Tesla owner eagerly awaiting his turn to become a beta tester of the Full Self-Driving software. “Tesla really pushes the boundaries, not only in the technology but also in how they release it. I don’t think any other company would have the guts to do what Tesla is doing.”

Part of the draw for most beta testers surveyed by The Post is the chance to play a small role in advancing the technology faster. When the car does something wrong, drivers can press a button on the car’s screen that sends data about the incident to Tesla, so software engineers can analyze it and improve the algorithm.

Pushing that button is fine, but it might not help much, said Mahmood Hikmet, a New Zealand-based autonomous-vehicle engineer who works in a different part of the industry, focused on autonomous shuttles.

He and other experts in the field say Tesla cannot possibly digest the fire hose of data coming from its cars all at once. Other companies have safety drivers test specific problem areas at specific times, rather than driving around aimlessly.

“You don’t really need 12,000 or more people,” Hikmet said. “For a company with the sixth-largest market cap in the world, it could hire 50 to 100 in-house testers and run into all the problems it has … You can’t parallelize a lot of things.”

For Chris, buying a Tesla has been “life-changing”: the Autopilot function has helped ease his grueling commutes, which begin on the dirt road outside his rural home in Fenton, Mich., and end in Ann Arbor.

Chris, who asked to be identified only by his first name for safety reasons, saw the car do some flatly wrong things in early beta testing, like stopping for no reason or nearly running into a curb. But he intervened and drove the car to safety every time, and said the car handles more than 90 percent of the drive without a problem.

“I’m not going to put anyone in danger. I have to watch out for the cars around me,” he said. “You are potentially part of this future of transportation because you are essentially training this car.”

A Tesla Model 3 electric vehicle leaves a Tesla Supercharger station after charging in Kettleman City, Calif., on July 31, 2019. (Bloomberg photo by Patrick T. Fallon)

