Tesla Self-Driving Beta Test Failures on Public Roads Concern Safety Experts

Self-driving technology is a modern marvel. But Tesla, known for bringing self-driving vehicles to the public, is under fire for recent updates to its self-driving software that have been known to make a myriad of mistakes. A recent report from Consumer Reports shows that people are concerned that Tesla’s Full Self-Driving software is not as safe as the company makes it out to be, and that everyday drivers are the test subjects. Safety advocates fear that autonomous driving technology is developing faster than policymakers can legislate it, leaving policy gaps and a lack of accountability for companies like Tesla.

Everyday Drivers Testing Dangerous Technology

In July of this year, Tesla began issuing updates to its vehicles’ software over the air. This update, beta version 9 of its Full Self-Driving software, is an advanced autonomous system that relies solely on cameras, as opposed to previous versions, which relied on both cameras and radar. But with the new tech come new dangers. A number of Tesla owners have noticed their vehicles displaying hazardous behavior on the roads, such as making unsafe lane changes and unprotected left turns.

Consumer Reports released a report stating its concern about the new software updates. In the report, it calls out Tesla for essentially making consumers unwittingly test the company’s new technology. Jake Fisher, the senior director of Consumer Reports’ auto test center, stated that drivers are “paying to be test engineers of developing technology without adequate safety protection.”

While other companies that operate self-driving vehicles, such as Argo AI, Waymo, and Cruise, limit their software testing to closed courses and/or have trained backup drivers monitoring the vehicle tests, Tesla is essentially allowing its consumers to be guinea pigs on the open road. This endangers not only Tesla drivers but everyone around them as well. Bryan Reimer, the founder of the Advanced Vehicle Technology Consortium, notes that while Tesla drivers are at least aware of the risk they assume with self-driving vehicles, “other road users [such as] drivers, pedestrians, cyclists, etc., are unaware that they are in the presence of a test vehicle, and have not consented to take on this risk.”

Safety advocates are also criticizing the lack of driver monitoring in self-driving Tesla vehicles. At the very least, they argue, Tesla should be monitoring drivers who use the self-driving technology in real time to ensure they are not just paying attention, but are actively engaged while the vehicle is driving. Without adequate driver supervision, test vehicles on public roads can have deadly consequences. For example, a self-driving test vehicle for Uber fatally struck a pedestrian in Tempe, Arizona, in 2018; the safety driver was watching videos on her cellphone and did not stop the vehicle in time.

The Consequences of Unregulated Technology

While some companies have enacted internal safety protocols in hopes of keeping their drivers and others on the roads safe, there is a lack of sufficient regulation at the federal level, leaving complicated questions of legality and liability when self-driving technology causes a car accident.

Furthermore, some states have more detailed regulations than others. This inconsistency in state regulation has led some companies to skirt the rules by moving their operations to states with more lenient guidelines. For example, in 2015, self-driving technology company Waymo expanded its testing efforts to the Lone Star State, taking advantage of the fact that Texas had no laws prohibiting vehicles without steering wheels and pedals. Waymo has since joined forces with J.B. Hunt to put self-driving 18-wheelers on Texas roads.

The Tesla Difference

But whereas the aforementioned shipping partnership boasts years of test drives on closed courses, tens of billions of miles driven via computer simulation, and over 20 million miles driven on public roads, Tesla makes no such claims. Instead, Tesla places beta versions of software updates directly into the hands of its drivers, which means the updates are largely tested in real time, on public roads, by unsuspecting Tesla drivers.

Selika Josiah Talbott, a professor at the American University School of Public Affairs who studies autonomous vehicles, called Tesla a “lone wolf” for this approach, compared to its peers in what she describes as an otherwise safe industry. For example, self-driving software company Waymo states that each time it changes its software, the updates are thoroughly vetted through a combination of simulation testing, closed-course testing, and testing on public roads, and a Cruise representative stated that its vehicles are sent onto public roads only once the company is sure they are safer than vehicles operated by live drivers.

Tesla drivers have noticed the alarming errors as well, with videos of the system update in action cropping up online. One video uploaded by YouTuber AI Addict shows a Tesla scraping against shrubbery, occupying the wrong lane, making unprotected turns, and driving directly toward a parked car, while another from YouTuber Chuck Cook shows a Tesla nosing out into oncoming traffic while attempting multiple unprotected left turns. Other drivers have complained that the software drives “like a drunk driver,” and at times the software has been known to disengage at random, forcing the driver to take over without notice. Most recently, five Houston-area law enforcement officers made plans to sue Tesla after a Tesla operating on Autopilot ignored the officers’ vehicles’ flashing lights and crashed into a traffic stop, injuring the officers.

Steps Towards Regulation

The National Highway Traffic Safety Administration (NHTSA) has ordered new crash-reporting requirements for accidents involving self-driving vehicles, especially those in which the autonomous driving technology was in use. This will hopefully allow companies to better monitor how their vehicles’ automated systems function on public roads in live situations and adapt accordingly to better protect future drivers in similar traffic scenarios.

On August 13th, the NHTSA launched an investigation in response to mounting reports of accidents caused by the company’s self-driving systems. The probe is set to cover an estimated 765,000 vehicles manufactured between 2014 and 2021. In the document it released, the agency stated that its investigation will “assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” and that it will also further examine the contributing circumstances in eleven confirmed collisions believed to have been caused by the technology.

That said, the technology is developing too rapidly for the government to respond adequately and in a timely manner. Safety watchdogs are urging policymakers to implement safety regulations that ensure autonomous driving companies operate safely and are held accountable when they do not. Unfortunately, without more stringent federal regulations as a motivator, safety advocates fear that Tesla will remain complacent about its self-driving technology. In the meantime, with car accidents increasing year by year, the consequences can be deadly for both Tesla drivers and everyone around them.

If you or a loved one has been injured in a car or truck accident, it is imperative that you speak with an attorney as soon as possible. Our experienced Houston car and truck accident lawyers have been successfully representing people injured in car accidents for over 25 years, and we can help you too. Consultations are free, and you don’t pay unless we win. Call us at 713-224-9000, or fill out our contact form here.