
Tesla FSD robotaxis in ride-hailing raise alarms as Las Vegas crash highlights dangers of self-driving in regulatory gray zone

While Tesla CEO Elon Musk prepares to unveil ambitious plans for a fully autonomous ride-hailing network, many drivers acknowledge the software's limitations. Some avoid using it in complex environments like airport pickups, while others rely on it to reduce stress during long hours on the road. Experts are urging closer regulation, with some even suggesting ride-hailing companies ban the use of FSD until stricter oversight is in place.

With safety in the balance, the debate over Tesla's role in the future of autonomous ride-hailing continues to grow.

Tesla’s Full Self-Driving (FSD) software is being increasingly adopted by Uber and Lyft drivers, raising significant safety and regulatory concerns as these vehicles operate as makeshift "robotaxis." A recent accident involving a Tesla running on FSD, which collided with an SUV in suburban Las Vegas in April 2024, has spotlighted these concerns. Federal safety officials have launched an inquiry into the crash, the first reported incident of a Tesla using FSD while driving for Uber.

Tesla CEO Elon Musk, who is scheduled to unveil plans for a dedicated robotaxi service on October 10, has long envisioned a network of autonomous Teslas used for ride-hailing. However, drivers using FSD as part of their ride-hailing work are taking the concept into their own hands, despite the technology's limitations. According to Reuters, drivers like Justin Yoon, who was involved in the April accident, have noted that while the software can reduce stress for drivers, it is far from flawless.

Yoon was behind the wheel of his Tesla, hands off, when the car, operating on FSD, entered an intersection at 46 mph (74 kph). The software failed to detect an SUV crossing the road until the last moment, and Yoon had to take control to reduce the impact of the collision. He described the software's imperfections in a post-crash video: "It's not perfect, it'll make mistakes, it will probably continue to make mistakes." Both Yoon and his passenger suffered minor injuries, and his car was totalled.

FSD is categorized by the US government as a form of partial automation, which means drivers are required to remain attentive and ready to take control at any time. Despite these guidelines, FSD's broader use in commercial settings, particularly ride-hailing, is pushing the boundaries of existing regulations.

Uber and Lyft have both emphasized that drivers are responsible for ensuring safety, regardless of whether they use FSD. Uber stated, “Drivers are expected to maintain an environment that makes riders feel safe; even if driving practices don't violate the law.” Similarly, Lyft reiterated, “Drivers agree that they will not engage in reckless behaviour.”

Some drivers have expressed concerns over the limitations of the software. Sergio Avedian, a Los Angeles ride-hail driver and contributor to "The Rideshare Guy" YouTube channel, said, “I do use it, but I'm not completely comfortable with it.” He estimates that 30% to 40% of Tesla drivers across the US who work for ride-hailing services use FSD regularly. Many drivers also avoid using it in challenging situations, such as airport pickups or construction zones.

The federal government is aware of the risks associated with FSD but has yet to introduce specific regulations governing its use in ride-hailing services. Missy Cummings, a robotics expert at George Mason University and a former advisor to the National Highway Traffic Safety Administration (NHTSA), said that oversight is crucial. “If Uber and Lyft were smart, they’d get ahead of it and they would ban that,” she suggested. Cummings and others believe that formal investigations into the use of driver-assistance technologies by ride-hail drivers are needed.

Some drivers remain optimistic about the technology's future, hoping that autonomous capabilities will eventually be advanced enough to remove human oversight altogether. Kaz Barnes, a ride-hail driver who has completed over 2,000 trips using FSD since 2022, stated, “You would just kind of take off the training wheels. I hope to be able to do that with this car one day.”

For now, however, FSD remains a work in progress, and as its use in commercial ride-hailing grows, so do concerns about the safety and regulatory challenges it presents.