Prototypes of driverless cars and trucks are becoming more and more common. This technology’s path to the roadway, however, has at times been rocky.
More often than not, problems with autonomous vehicles have something to do with human error. As the Telegraph reports, at least one pioneer in the industry was forced to abandon its partially automated vehicle program and move toward fully driverless cars just a few years ago to remove human mistakes from the equation. However, even with complete autonomy, other drivers on the road can still make fully self-driving vehicles prone to accidents.
How Autonomous Cars Work
Autonomous vehicles are cars that drive themselves. These cars come equipped with computers and sensors capable of taking in information about their surroundings and making decisions based on that information.
A fully autonomous vehicle is one that operates solely at the command of this computer, without any input by a human driver. However, there are also cars that run with lower levels of autonomy, requiring varying levels of human action.
The automotive engineering standards organization SAE International has devised a classification scheme for different levels of autonomy in vehicles. These levels range from Level 0, at which the computer has no control over the car and can only issue warnings to a human driver, to Level 5, or full autonomy.
The area in the middle — especially at Levels 2 and 3 — involves a sharing of responsibility between the computer and a human driver. These levels may have seemed at first like a good starting point for introducing driverless car technology gradually, but in fact, this gray area creates opportunities for careless drivers to shirk their share of the responsibility, potentially causing accidents. These opportunities became a reality for one autonomous vehicle company in the not-too-distant past.
Google’s Problems with Partially Automated Vehicles
Google is one of the leading companies in the automated vehicle field. Its driverless car division, Waymo, is making great strides toward normalizing the technology. In fact, just last year, Waymo became the first company to put driverless cars on the road without test drivers to correct them in the event of a failure, according to The Verge. But earlier incarnations of Google’s autonomous vehicle program were not as successful as its recent achievements might suggest, the Telegraph reports.
It appears that some of Google’s earlier experiments took drivers only partially out of the mix, “assisting” humans and performing some driving tasks for them instead of taking over completely. The human driver was still responsible for some degree of decision-making. This type of technology might qualify as a Level 2 or Level 3 on the SAE’s scale of autonomy.
In theory, this arrangement might sound like the safest option, allowing a highly advanced computer to perform basic tasks, while relying on humans with real driving instincts to make the final call when conditions on the road get dicey. In reality, however, some drivers failed to uphold their end of the bargain.
Google has recently admitted that some of the drivers it employed to test its partially autonomous vehicles began to take their eyes off the road in 2013. Apparently overconfident in the car’s ability to fend for itself, they turned their attention to other tasks, including doing their makeup or even falling asleep at the wheel.
Waymo executives claim that they shut down their partially autonomous testing project shortly after that, fearing that this sort of behavior could result in more collisions. The company has since moved exclusively toward developing and testing fully automated vehicles. If it succeeds in creating vehicles that truly need no human input, whoever is riding in the car will be free to attend to other business, but the potential for a crash will not necessarily have been eliminated.
How Fully Autonomous Vehicles Can Crash
Once there is no longer a human behind the wheel and a driverless car’s computer becomes advanced and experienced enough to adapt to every situation, the greatest danger that comes with that car being on the road lies in the human drivers around it. This is because of the difference between the way humans and computers drive.
Fully autonomous vehicles are programmed to follow every traffic law exactly. Unlike humans, they are unwilling to fudge the rules in the interest of convenience, speed, or even courtesy to other drivers. Other drivers on the road who are not paying full attention may crash into them after making incorrect assumptions about how they will act.
When fully automated vehicles are involved in accidents, therefore, it is usually a human driver who is at fault. The extreme caution driverless cars exercise ironically makes them somewhat unpredictable, since drivers expect everyone else on the road to handle situations the way most humans do. Often, these crashes involve a human driver rear-ending the driverless car at low speed. Whether the vehicle is partially or fully automated, it seems that humans continue to be the thorn in the side of driverless car companies.
What to Do If You Are in an Accident with an Autonomous Vehicle
If you are in a collision with a car that is partially or fully driven by a computer, your next steps are the same as they would be after a crash with any other vehicle. You should still call the authorities, and if you believe you may have been injured, you should seek medical attention right away.
If you think the driverless car may have been at fault, you may still be able to seek compensation. North Carolina law is largely silent on autonomous vehicle crashes, which means whoever is considered the operator of the vehicle could potentially be held liable. Who that operator is can vary depending on whether the car is fully or partially autonomous.
If you or someone you love has been hurt in an accident with another vehicle, and you believe someone else was responsible, an experienced personal injury attorney can help you determine whether legal action may be a smart option. Contact the Shelby car accident lawyers of Teddy, Meekins & Talbert, P.L.L.C., today by phone or online for more information about how we can help.
Additional Motor Vehicle Accident Information
- Auto Accidents in North Carolina
- Car Wreck FAQs
- Truck Wrecks
- Truck Accidents & Distracted Driving
- Motorcycle Accidents
If you are seeking the services of a lawyer in North Carolina, it’s important to become familiar with the experience and qualifications of the attorney who may represent you. Our relentless legal team, led by David Teddy, Ralph Meekins, and Daniel Talbert at Teddy, Meekins & Talbert, P.L.L.C., is based in Shelby, North Carolina. We represent ordinary people against powerful insurance companies and the government.