There are many reasons that self-driving cars – or autonomous vehicles – can be dangerous, including software malfunctions, sensor errors, hacking vulnerabilities, human over-reliance and more.
As of May 2022, 392 accidents involving vehicles with Level 2 driver assistance systems (systems that control acceleration, deceleration and steering) had been reported to the NHTSA. Tesla models were involved in 273 of those accidents, and six of the accidents involved deaths.
Call us today for a free consultation if you or a loved one has been injured or killed in an accident involving a self-driving vehicle. We have been representing those injured in accidents, including those injured by autonomous cars, since 1990, and we have recovered hundreds of millions of dollars for our clients. We can help you recover compensation for your medical bills, pain, suffering, lost wages, loss of earning capacity, wrongful death and more.
What Are the Top 5 Dangers of Self-Driving Cars?
Sensor Malfunctions
Machine error must be considered when weighing the safety of self-driving vehicles. Even if self-driving cars prevent more accidents than they cause, the risk of a car or truck accident caused by machine error is not eliminated. And should a vehicle part or software system fail, a self-driving car could put its driver in more danger than manual control would.
One form of machine error is sensor malfunction. Accidents may result from sensors failing outright or from hazards going undetected because of sensors' limitations. A malfunctioning sensor can cause a driverless car to crash into pedestrians, lane dividers or street signs, or to rear-end the vehicle ahead when it misjudges the following distance.
Detecting obstacles at long distances and high speeds is a particular challenge for sensors.
AI Limitations
Another form of machine error is AI limitations. AI cannot always predict or prevent the behavior of human car and truck drivers, especially when those drivers break traffic rules. These limitations can also leave self-driving cars struggling to navigate complicated intersections.
Both sensor malfunctions and AI limitations can cause a self-driving car to fail to recognize objects such as debris, animals or pedestrians, or to misjudge what an animal or person will do next.
Hacking Concerns
Self-driving vehicles rely upon computers to function, and hackers who bypass those computers' firewalls can override controls and redirect vehicles. Criminals could use hijacked vehicles to cause gridlock and accidents, steal the cars outright, or even reprogram them to behave like explosives or ram into obstacles. Hackers may also access vehicle owners' data, increasing the risk of identity theft, and can even determine from that data whether a driver is home.
Autopilot Over-Reliance
Self-driving cars currently on the road are not fully autonomous. If a human relies too heavily on autopilot, an accident is likely to happen sooner or later.
When drivers over-rely on self-driving technology, human error follows. A driver may relax and stop monitoring the vehicle, then react too slowly to retake control and avoid an imminent collision.
Autopilot can be a blessing and a curse. Autopilot systems may activate when they shouldn't, or fail to activate when they should, which can lead to disaster when drivers are inattentive. Over time, reliance on autopilot may even erode drivers' manual driving skills.
Liability Issues
There is concern regarding liability and self-driving cars. It isn't clear which party assumes liability in accidents caused by self-driving vehicles: the manufacturer, owner or mechanic of the car, or even a separate party. Right now, there is no federal or state standard that assigns blame for accidents like these.
In addition, it's hard to say whether insurance companies will try to deny claims involving self-driving car accidents. When insurance companies act in bad faith and deny self-driving vehicle accident claims, attorneys have little precedent to rely on in challenging those denials.
What Are Additional Dangers of Self-Driving Cars?
Additional dangers of self-driving cars include:
Lack of Regulations
Self-driving vehicle regulations are still in an early phase. Standards and rules vary vastly from region to region, making compliance difficult for manufacturers. Some states, such as Arizona and California, actively support the use and testing of autonomous cars, while other states have strict bans or restrictions in place.
This inconsistency in legal regulations creates barriers to deploying and testing self-driving vehicles, and companies often need to adjust their plans to meet different limitations and requirements in different states. Uber's initial autonomous car program, for example, continued in Arizona while being halted in San Francisco over regulatory issues.
Privacy Violations
Self-driving vehicles, which rely on computerization for safety, collect data on their passengers. That data collection isn't inherently dangerous at first glance; it could, for example, make it easier for passengers to reach common destinations.
However, concern has been expressed that the data self-driving cars collect could be sold in an unapproved manner. Advertisers could use that data to unfairly target people who drive to particular cities or stores.
Those who find data collection like this to be invasive may wish to forgo self-driving vehicles in favor of more traditional vehicles.
Fire Risk
Many self-driving cars use lithium-ion batteries. These batteries are highly combustible and lithium fires can reach extremely high temperatures of 3,632 degrees Fahrenheit, or 2,000 degrees Celsius.
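For reference, the Fahrenheit figure above is simply the standard Celsius-to-Fahrenheit conversion applied to 2,000 degrees. A minimal sketch of that arithmetic (purely illustrative, nothing specific to vehicle batteries):

```python
def celsius_to_fahrenheit(celsius: float) -> float:
    """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# The lithium-fire temperature cited above: 2,000 degrees Celsius
print(celsius_to_fahrenheit(2000))  # 3632.0
```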
Software Glitches
Software glitches can lead to mistakes which human drivers wouldn’t make, like running red lights or phantom braking.
Ethical Dilemmas
There are ethical dilemmas involved in the development of self-driving cars such as how they should prioritize the safety of passengers versus the safety of pedestrians in accidents.
Workforce Issues
There are concerns that self-driving vehicles will end up replacing human drivers in taxis, buses, trucks and cars.
Motion Sickness
Self-driving vehicle passengers might experience motion sickness.
Are Self-Driving Cars More Dangerous?
No. Self-driving vehicles are, in general, safer than vehicles driven by humans. However, specific situations or driving tasks can cause them to be more dangerous.
One study found that autonomous vehicles are less likely than human-driven vehicles to get in accidents overall, but more likely to get in accidents in low-visibility conditions or when turning. The study analyzed California crash data from 2016 to 2022, using a statistically matched case-control method to find accidents that happened under similar circumstances.
Some claim that not enough information exists regarding self-driving cars' safety, since not every crash gets reported to the police. In addition, some companies avoid driving on freeways, and freeways typically have fewer crashes per mile.
Autonomous vehicles might still require human guidance in some environments and situations, and to be truly driverless they would need to be able to drive safely 100 percent of the time.
How Likely Is It for a Self-Driving Car to Crash?
Self-driving cars do get in accidents, but the likelihood of any one car crashing depends upon numerous factors, including the environment, the driver and the technology.
Regarding technology, more heavily automated vehicles may be more likely to make mistakes. System failures such as software glitches, product defects and mechanical failures can lead to accidents.
Human error remains a major factor in accidents, and the line between autonomy and assistance can become blurred. What a driver was obligated to do when a collision occurred can affect the collision's outcome.
Roadway conditions such as weather, traffic and obstructions can affect safety, and failure to follow traffic laws can cause accidents.
Additional factors in self-driving car accidents include mechanical failures like flat tires, whether systems are engaged, misinterpretations of other drivers’ actions, and other drivers not anticipating autonomous vehicles’ actions.
Tesla's vehicles have been involved in numerous accidents, including incidents of wrong-way driving, phantom braking and collisions with emergency vehicles.
Crashes have also been reported involving Waymo's driverless cars.
Self-driving vehicles are involved in around 9.1 accidents per million miles driven, a higher rate than the 4.1 crashes per million miles for conventional vehicles. However, the injuries in self-driving car accidents are typically minor compared with those in conventional car accidents. Self-driving cars get in more rear-end collisions than conventional cars, but fewer broadside and pedestrian accidents.
Who Is Liable in a Self-Driving Car Accident?
If someone is harmed by a driverless car, the vehicle manufacturer is usually held liable for the collision, as the car's software and systems, rather than the owner or passenger, are considered responsible for the accident. This means injured parties will likely file lawsuits against car companies seeking compensation for their losses, especially when the accident occurs while autonomous driving mode is active.
When driverless car accidents happen, thorough investigations are conducted to determine whether the accident stemmed from a malfunction in the vehicle's hardware or software, or from human error such as improper use of autonomous features.
There is potential for shared liability. Depending upon the circumstances, additional parties like software developers or even vehicle owners could possibly share liability in driverless car accidents.
What Do I Do If I’m Injured in a Self-Driving Car Accident?
Seek Medical Attention
Prioritize your health immediately, seeking medical treatment for your injuries. Do this even if you don’t think you’re hurt badly or at all. It’s vital to have any hidden injuries documented by a professional if you want to be able to recover compensation for them. If you wait too long to have any injuries diagnosed, insurance companies can claim something else caused your injuries and refuse to pay for them.
Report the Accident
Contact the authorities and provide them with the details of the accident.
Collect Information
If possible, gather evidence such as pictures of the vehicle damage and the scene, as well as any relevant data from your vehicle's systems.
Consult With a Lawyer
You should consider speaking with a lawyer who handles personal injury cases, such as the attorneys at Nadrich Accident Injury Lawyers, to understand your rights and options for pursuing financial compensation.
Contact a Car Accident Attorney Today
Call us today for a free case evaluation if you or a loved one has been injured or killed in an accident involving a driverless vehicle.
We have been helping those injured in accidents, including those injured by automated driving systems, for over 30 years and have recovered over $750,000,000 for our clients. Our extensive experience in handling traffic accident cases will allow us to recover the most compensation possible for you.
When we get clients who are injured by driverless cars and can’t afford treatment for their injuries, we get them to doctors who treat them on a lien, meaning they’re not charged until their case is over. Doctors do this for our motor vehicle accident clients because they know we get great results for our clients.
If you've been injured by self-driving technology, we won't charge you any fee unless and until we recover compensation for you, so you'll never owe us any upfront or out-of-pocket fee. We only charge a percentage of whatever money we recover for you; if we recover nothing, you owe us nothing.
Call us today for a free consultation, text us from this page or fill out this page’s free case evaluation form.