Teslas and Motorcycles Make for a Bad Combination
Growing up, many of us watched cartoons and television shows that depicted flying cars in the future. While we have not yet reached that level of innovation, we are seeing steady advances toward autonomous vehicles. Tesla is leading the charge, but being first comes with its downsides.
Recently, the automotive company came under fire again after two collisions between Teslas using Autopilot and motorcycles. The Autopilot system appears to have difficulty recognizing motorcycles, or fails to recognize them at all.
In both of these nighttime highway collisions, the Tesla rear-ended the motorcycle, and both motorcyclists died. In one crash it is unclear whether Autopilot was engaged; in the other, the driver confirmed using the feature.
An initial investigation by National Highway Traffic Safety Administration (NHTSA) officials points to the Autopilot system as the cause of the accidents. If a formal investigation confirms this, the agency will include these fatalities in a more extensive investigation of Tesla's safety practices.
Tesla Autopilot recalls
Motorcycles are not the only things Tesla Autopilot has failed to recognize; there have also been collisions involving pedestrians and emergency vehicles. Autopilot is marketed as a safety feature, meant to make drivers feel protected and more comfortable behind the wheel. Unfortunately, the opposite is happening.
Thus far, the only change Tesla and Elon Musk have made is removing radar sensors and relying solely on cameras and onboard computers. Safety advocates note, however, that dropping radar can hinder the system's performance at night.
At a minimum, many believe the company should change its marketing. “Self-driving” and the name Autopilot imply that the driver does not need to do anything, but drivers must be ready to take over. These vehicles cannot drive themselves, but owners think they can.
Michael Brooks, acting executive director of the nonprofit Center for Auto Safety, vented his frustration to Auto Blog: “What the hell are they doing while these crashes continue to occur? Drivers are being lured into thinking this protects them and others on the roads, and it's just not working.”
How do autonomous vehicles work?
Autonomous vehicles work by sensing the environment and operating without human intervention. In a perfect world, a human would not even need to be in the car for it to operate. There are different levels of automation, and marketing can sometimes confuse drivers.
According to the Society of Automotive Engineers, there are six levels of automation, with level zero being fully manual and level five being fully autonomous. The categories are as follows:
- Zero - No Automation
- One - Driver Assistance
- Two - Partial Automation
- Three - Conditional Automation
- Four - High Automation
- Five - Full Automation
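The key legal and practical point in these levels is where responsibility sits: through level 2, the human is still driving and must supervise at all times. As a rough sketch (the names follow the SAE summary above, but the code itself is purely illustrative, not any official API):

```python
from enum import IntEnum

# Illustrative encoding of the six SAE automation levels listed above.
class SAELevel(IntEnum):
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def driver_must_supervise(level: SAELevel) -> bool:
    """At levels 0-2 the human is still the driver and must monitor
    the road at all times; only from level 3 up does the system
    drive itself under at least some conditions."""
    return level <= SAELevel.PARTIAL_AUTOMATION
```

Tesla's Autopilot is widely described as a level 2 system, meaning the driver must remain engaged despite what the name suggests.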
Autonomous vehicles use sensors, actuators, algorithms, and software to take in their surroundings and react accordingly. They create a map of the environment based on the data they collect. Radar sensors track the vehicles around them, video cameras look for road signs and signals, and lidar sensors measure distance by bouncing pulses of light off surrounding objects.
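The sensor-fusion idea described above can be sketched in a few lines: each sensor contributes its own reading, and the vehicle acts on the combined picture. Every name and threshold here is an illustrative assumption, not any manufacturer's actual code, but it shows why losing one input (such as a camera failing to classify a motorcycle at night) can defeat the whole decision:

```python
def should_brake(radar_m, lidar_m, camera_sees_object):
    """Combine radar and lidar range estimates (meters, or None if
    unavailable) with a camera classification flag into a single
    braking decision. Purely illustrative logic."""
    readings = [r for r in (radar_m, lidar_m) if r is not None]
    if not readings:
        # No range data: a camera-only system must infer distance
        # from the image alone, which is harder at night.
        return camera_sees_object
    distance = min(readings)  # be conservative: trust the closest estimate
    # If the camera never classifies the object, no braking occurs,
    # even when the range sensors report something close ahead.
    return camera_sees_object and distance < 50.0
```

Under this sketch, a missed camera classification means no braking regardless of what radar or lidar report, which mirrors the failure mode safety investigators are examining in the motorcycle crashes.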
The risk of autonomous vehicles
Self-driving and autonomous vehicles carry many risks, and when the technology falls short, the results can be tragic. Lidar raises concerns of its own. Safety experts question what might happen when multiple self-driving cars share the road: will their sensors interfere with each other or work in unison? Additionally, lidar units are expensive, and manufacturers must find a way to include them affordably once these vehicles reach mass production.
- Weather. While Los Angeles does not have snow, residents face other weather conditions that affect how an autonomous vehicle behaves. Rain or fog that blocks your view also blocks the sensors, and the cameras cannot see lane markings or other cars, which can easily cause a car accident.
- Regulations. While manufacturers can program autonomous vehicles to obey traffic laws, will they consider the local regulations for the specific area? Additionally, federal and state lawmakers are imposing different rules on how and when these vehicles can hit the road. These regulations must be considered when programming and shipping cars to consumers.
- Artificial and Emotional Intelligence. A computer program operates on logic alone; the technology does not pick up on social cues or non-verbal communication. A human driver can look at pedestrians, motorcyclists, and others and notice subtle changes in posture or intent, and that body language is essential while driving. The concern is whether an autonomous vehicle can pick up on these cues or will simply ignore them.
You can claim damages after a Los Angeles car accident
While you might spot a Tesla on your commute and avoid it, other vehicle manufacturers are trying to enter the self-driving automotive industry. When an autonomous vehicle accident occurs, legal liability is complex. Take the two crashes from this summer mentioned above. While one is still under investigation, the other shows that the Autopilot feature did not do what it was supposed to and led to a fatal motorcycle accident. Without the feature, the driver might have been able to avoid the collision. So, who is responsible: the driver of the Tesla? The company itself? The people who programmed Autopilot? All of them?
Tesla may not be exempt from liability, nor may the driver. While some accidents result in a fatality at the scene, others require emergency transport, and the victim dies later in the hospital, experiencing pain and suffering throughout. Our firm focuses on your individual needs, and we firmly believe you deserve compensation for the pain and suffering your loved one incurred from the negligence of another entity.
Liability is complex, but the Los Angeles autonomous vehicle accident lawyers from McNicholas & McNicholas are ready to hold all relevant parties responsible. You have options if your loved one dies in an autonomous vehicle accident. You may recover damages for the pain and suffering your loved one incurred before they died. Call our office at 310-474-1582, or submit our contact form to schedule a free initial consultation today.
Please note that this blog is not to be construed as legal advice. Because every case is fact-specific, you should consult directly with an attorney to obtain legal advice specific to your situation.
With more than 25 years’ experience as a trial lawyer, Partner Patrick McNicholas exclusively represents victims in personal injury, product liability, sexual assault and other consumer-oriented matters, such as civil rights, aviation disasters and class actions. Learn more about his professional background here.