Philadelphia, PA / May 3, 2019 / --- Self-driving cars have been hailed as a way to make the roads safer. Proponents of autonomous vehicles say that self-driving technology can reduce motor vehicle accidents by eliminating human error.
However, testing has revealed that self-driving cars might have a long way to go before they’re truly safe. Even more troubling, they might be vulnerable to attacks by hackers.
According to researchers from four different universities, it might be possible to trick an autonomous vehicle by altering street signs with basic stickers. To the human eye, the stickers may look harmless. However, the artificial intelligence used by self-driving vehicles can interpret the altered signs in a dangerous way, possibly leading to a motor vehicle accident.
Experiments Show Stickers Can Change How AI Sees a Road Sign
The researchers behind the study used stickers to “spoof” road signs, changing the words in a way that incorporates the original wording. For example, using stickers, the researchers transformed a “stop” sign into “love stop hate.” To the human eye, this might look like graffiti or a joke. However, the artificial intelligence reading the altered sign can misclassify it, potentially causing a major motor vehicle accident.
The experiment was conducted by a graduate student at the University of Washington, along with colleagues from other universities. In their research, they note that they didn’t test their alterations on any actual self-driving cars.
However, they trained a “deep neural network” to read various road signs. From there, they developed an algorithm that makes changes to the signs.
In one test, the researchers found that the AI misread a speed limit sign. In another experiment, the AI interpreted a right turn sign as a stop sign or an added lane sign.
The researchers noted that their research is simply a proof of concept, meaning that they don’t believe the changes they made could fool a self-driving vehicle on the road today.
However, given enough development and fine-tuning, these types of hacking attempts could probably trick the AI behind a self-driving car, especially if someone had access to the system they wanted to target.
Other Experiments Successfully Trick a Self-Driving Car
While the research from the University of Washington didn’t test any actual self-driving vehicles, another experiment successfully tricked a Tesla Model S into switching lanes through the use of stickers on the road.
According to a report, researchers were able to trick the Tesla autonomous vehicle into switching lanes and driving toward oncoming traffic simply by placing three stickers on the road. The car’s Autopilot system interpreted the stickers as lane markings veering toward the left and steered the vehicle to follow them.
According to a Tesla spokesperson, the test’s results are “not a realistic concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so.” However, experts point out that the very idea of autopilot makes most people think they don’t need to be as fully alert behind the wheel as they would be if they were driving without autopilot technology.
Humans Cause Most Autonomous Vehicle Crashes
Despite the potential problems with AI, studies show that humans are still responsible for the majority of self-driving vehicle accidents. According to a study of self-driving vehicle crashes in California that occurred between 2014 and 2018, there were 38 motor vehicle accidents involving self-driving cars operating in autonomous mode. In all but one incident, the human driver was responsible for causing the vehicle to crash.
In another 24 incidents, the study found that the vehicle was in autonomous mode but stopped when the accident occurred. In those cases, none of the accidents happened due to an artificial intelligence error. Instead, those incidents were caused by the human operator. In three of the cases, the incident was the result of a person climbing on top of the autonomous vehicle or attacking it from the outside.
Supporters of self-driving cars say these statistics show that autonomous vehicles are still much safer than motor vehicles operated by human drivers alone.
There have also been several fatal self-driving car accidents. In one crash, which involved a Tesla autonomous vehicle, an investigation revealed that the human driver was inattentive at the time of the accident.
In another Tesla autonomous vehicle crash, the driver was killed when the car’s autopilot technology confused a white semi-truck trailer with the sky. An Uber self-driving car also fatally struck a pedestrian in 2018. In that case, an investigation revealed that the driver was watching television just before the accident occurred. There have also been a handful of non-fatal autonomous vehicle accidents. Compared to the number of regular motor vehicle wrecks, however, experts say these accidents happen much less frequently.
If You Have a Car Wreck in Philadelphia, You Can Consult with Accident Lawyer Rand Spear Who Specializes in Motor Vehicle Accidents of All Types
If you or someone you know has been injured in an accident with a self-driving car, you should speak with a Philly motor vehicle accident lawyer as soon as possible. Attorney Rand Spear will walk you through your legal options and fight to get you the settlement you deserve. Call the Law Firm of Rand Spear at 215-985-2424 and speak with the best car accident lawyer in Philadelphia.
Philly Car Accident Attorney Rand Spear
Two Penn Center Plaza, Suite 200
1500 J.F.K. Blvd.
Philadelphia, PA 19102
Prior results cannot and do not guarantee or predict a similar outcome with respect to any future case. Recoveries always depend upon the facts and circumstances of each case, the injuries suffered, damages incurred, and the responsibility of those involved. This article is not to be considered legal advice; only the execution of a contingency agreement with this law firm will constitute an attorney-client relationship. The contents of this article are for general information only. If you would like to pursue a claim, please contact an attorney immediately to discuss your specific facts and circumstances regarding your claim. Some cases accepted by this law firm may be referred to or worked on by other lawyers, depending on the area of practice and specifics of a particular case.
There is no offer to sell, no solicitation of an offer to buy, and no recommendation of any security or any other product or service in this article. Moreover, nothing contained in this PR should be construed as a recommendation to buy, sell, or hold any investment or security, or to engage in any investment strategy or transaction. It is your responsibility to determine whether any investment, investment strategy, security, or related transaction is appropriate for you based on your investment objectives, financial circumstances, and risk tolerance. Consult your business advisor, attorney, or tax advisor regarding your specific business, legal, or tax situation.