Consumer Reports concerned Tesla uses owners to test unsafe self-driving software

A Tesla in full self-driving mode makes a left turn out of the middle lane on a busy San Francisco street. It jumps into a bus lane where it’s not meant to be. It turns a corner and nearly plows into parked vehicles, causing the driver to lurch for the wheel. These scenes were captured by car reviewer AI Addict, and similar videos are cropping up on YouTube. One might say these are all mistakes any human on a cell phone might have made. But we expect more from our AI overlords.

Earlier this month, Tesla began sending out over-the-air updates for its Full Self-Driving (FSD) beta version 9 software, an advanced driver-assist system that relies on cameras alone, rather than the cameras and radar used by Tesla’s previous driver-assist systems.

In reaction to videos showing unsafe driving behavior, such as unprotected left turns, and to other reports from Tesla owners, Consumer Reports issued a statement on Tuesday saying the software upgrade does not appear to be safe enough for public roads, and that it would independently test the update on its Model Y SUV once the vehicle receives it.

Running preproduction software is both work & fun. Beta list was in stasis, as we had many known issues to fix.

Beta 9 addresses most known issues, but there will be unknown issues, so please be paranoid.

Safety is always top priority at Tesla.

— Elon Musk (@elonmusk) July 9, 2021

The consumer organization said it’s concerned Tesla is using its existing owners and their vehicles as guinea pigs for testing new features. Making their point for them, Tesla CEO Elon Musk did urge drivers not to be complacent while driving because “there will be unknown issues, so please be paranoid.” Many Tesla owners know what they’re getting themselves into because they signed up for Tesla’s Early Access Program that delivers beta software for feedback, but other road users have not given their consent for such trials. 

Tesla’s updates are shipped out to drivers all over the country. The electric vehicle company did not respond to a request for more information about whether or not it takes into account self-driving regulations in specific states — 29 states have enacted laws related to autonomous driving, but they differ wildly depending on the state. Other self-driving technology companies like Cruise, Waymo and Argo AI told CR they either test their software on private tracks or use trained safety drivers as monitors. 

“Car technology is advancing really quickly, and automation has a lot of potential, but policymakers need to step up to get strong, sensible safety rules in place,” says William Wallace, manager of safety policy at CR, in a statement. “Otherwise, some companies will just treat our public roads as if they were private proving grounds, with little holding them accountable for safety.”

In June, the National Highway Traffic Safety Administration issued a standing general order that requires manufacturers and operators of vehicles with SAE Level 2 ADAS or SAE levels 3, 4 or 5 automated driving systems to report crashes. 

“NHTSA’s core mission is safety. By mandating crash reporting, the agency will have access to critical data that will help quickly identify safety issues that could emerge in these automated systems,” said Dr. Steven Cliff, NHTSA’s acting administrator, in a statement. “In fact, gathering data will help instill public confidence that the federal government is closely overseeing the safety of automated vehicles.” 

The FSD beta 9 software adds features that automate more driving tasks, like navigating intersections and city streets under the driver’s supervision. But with such detailed graphics showing where the car is in relation to other road users, down to a woman on a scooter passing by, drivers might be more distracted by the tech that’s meant to assist them at crucial moments.

“Tesla just asking people to pay attention isn’t enough — the system needs to make sure people are engaged when the system is operational,” said Jake Fisher, senior director of CR’s Auto Test Center, in a statement. “We already know that testing developing self-driving systems without adequate driver support can — and will — end in fatalities.”

Fisher said Tesla should implement an in-car driver monitoring system to ensure drivers are watching the road, in order to avoid crashes like the one involving Uber’s self-driving test vehicle, which struck and killed a woman crossing the street in Tempe, Arizona, in 2018.
