Automated vehicles are a real controversy these days. Some people claim they're the logical next step in the evolution of the automobile, and the eventual solution to a wide array of traffic and safety issues in the modern world. Others, however, are far more skeptical, and with good reason: although the technology is constantly improving, it's still miles from being ready to completely take over traffic as we know it. The human element remains integral to overall road safety, and how quickly that will change is anybody's guess at this point.
Don't get the wrong idea: self-driving cars are a fantastic idea that will solve a huge number of problems once the technology is perfected, and eventually humans may not need to know how to drive at all. In the meantime, though, if you plan to own and operate a self-driving vehicle in the near future, it's very important to be aware of all the security risks that come with it.
The Accuracy of GPS
In order for a self-driving car to navigate properly, its internal software has to be supplied with mapping information, which is delivered via GPS-based navigation, and herein lies the first issue. Anyone who has ever used GPS has almost certainly had an experience or two where the mapping information was inaccurate; whether it's an unmarked roadblock or a completely non-existent road that somehow shows up on the map, you can't always depend on the navigation a hundred percent. The problem is that a self-driving car has no choice but to depend on it a hundred percent, which can be quite dangerous when the information is wrong. The car "sees" the road through its maps, so it will not see a roadblock that isn't on the map – instead, it will go right through it.
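The danger described above can be illustrated with a toy sketch (all function and variable names here are hypothetical, not any real vehicle's software): a planner that trusts only the map proceeds straight into an unmapped roadblock, while one that also cross-checks a live sensor reading stops.

```python
# Toy illustration: map-only planning vs. planning that cross-checks a sensor.
# All names are hypothetical; real planners are vastly more complex.

def plan_step(waypoint, map_obstacles):
    """Map-only logic: trust the map completely."""
    if waypoint in map_obstacles:
        return "reroute"
    return "proceed"   # an unmapped roadblock is invisible here

def plan_step_with_sensor(waypoint, map_obstacles, sensor_reading):
    """Cross-check: a live sensor hit overrides whatever the map claims."""
    if sensor_reading == "obstacle" or waypoint in map_obstacles:
        return "stop"
    return "proceed"

# An unmapped roadblock at waypoint "B": the map says clear, the sensor disagrees.
print(plan_step("B", map_obstacles=set()))                                  # proceed
print(plan_step_with_sensor("B", map_obstacles=set(), sensor_reading="obstacle"))  # stop
```

The point of the sketch is simply that map data alone cannot be the last word; some independent perception of the road has to be able to veto it.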
Dependence on Traffic Signals
All technology is prone to failure at some point, and traffic signals are no different. The problem, once again, is the self-driving car's utter dependence on them. Google's self-driving cars use high-precision maps to locate the traffic lights, then point a camera in that direction to determine whether the car needs to stop. If the traffic light doesn't work, if it's mounted a bit higher or lower than the software expects and the camera can't register it, or if there's simply an unmapped traffic light on the road, the car will ignore it entirely. At this point it's also impossible for a self-driving car to interpret human hand signals, which means a police officer directing traffic in place of a broken light will be powerless to stop it as well.
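The map-then-camera pipeline described above can be sketched roughly as follows (a simplified illustration with made-up names and coordinates, not Google's actual code). Note how every failure mode in the paragraph falls out of the same two lookups: an unmapped light is never looked for, and a light that isn't where the map says it is never gets matched.

```python
# Simplified sketch of a map-guided traffic-light check (hypothetical names).

MAPPED_LIGHTS = {("Main", "5th"): (10.0, 4.5)}   # intersection -> expected position (m)

def check_traffic_light(intersection, camera_detections, tolerance=0.5):
    """Return 'stop'/'go', or the failure mode when map and camera disagree."""
    expected = MAPPED_LIGHTS.get(intersection)
    if expected is None:
        return "unmapped light ignored"          # the car never even looks
    for (x, y), state in camera_detections:
        if abs(x - expected[0]) <= tolerance and abs(y - expected[1]) <= tolerance:
            return "stop" if state == "red" else "go"
    return "light not found"                     # moved, broken, or out of frame

# A red light mounted 1.5 m higher than the map expects is simply not registered:
print(check_traffic_light(("Main", "5th"), [((10.0, 6.0), "red")]))  # light not found
```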
Software Can Be Hacked
Self-driving cars are operated by software, and software gets hacked. If someone breaks into your e-mail account, you lose a few passwords; if the same happens to your car's navigational computer, you could lose your life. Hackers are becoming very proficient at their craft and their tools keep improving, so self-driving car companies will have to seriously harden their software before anyone can be safe on the road. It's also very difficult to trace a hacker, since they almost always hide behind a VPN, which means prosecution isn't much of a deterrent either. If self-driving cars are ever going to become mainstream, the navigational software will always have to be one step ahead of hackers in terms of security, or basically everyone on the road will constantly be at risk.
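One basic building block of that hardening is making sure the car only acts on commands that are provably genuine. Here's a minimal sketch, assuming a shared secret key provisioned into the vehicle (the key, command names, and overall design are hypothetical, not any vendor's actual protocol): commands to the navigation computer are rejected unless they carry a valid authentication tag.

```python
# Minimal sketch of authenticated commands using Python's standard hmac module.
# The key and command format are invented for illustration only.
import hashlib
import hmac

SECRET_KEY = b"shared-key-provisioned-at-the-factory"   # hypothetical

def sign(command: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over the command."""
    return hmac.new(SECRET_KEY, command, hashlib.sha256).digest()

def accept(command: bytes, tag: bytes) -> bool:
    """Only execute commands whose tag verifies; compare_digest resists timing attacks."""
    return hmac.compare_digest(sign(command), tag)

legit = b"SET_SPEED 50"
tag = sign(legit)
print(accept(legit, tag))              # True: genuine command
print(accept(b"SET_SPEED 150", tag))   # False: tampered command is rejected
```

Authentication alone is of course nowhere near a complete defense, but it illustrates the general principle: the software must be able to tell a legitimate instruction from an injected one.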
Driver Education
Since self-driving cars are relatively new technology, the majority of people haven't had much contact with them, and they need to be thoroughly educated on how to operate one. As of today, there isn't nearly as much emphasis on driver education as there should be. Manual mode is still an essential piece of the picture: if the driver ever needs to take over, they must be certain they can do so quickly and efficiently before someone gets hurt. Drivers need to understand how the car functions at least on a basic level, for their own safety and the safety of everyone else on the road.
Bad Weather
Self-driving cars have proven themselves very capable in predictable conditions, but when nasty weather comes into play, that changes quite a bit. Even the companies that make self-driving cars recommend against operating them in heavy rain or snowfall, because the level of safety is simply not sufficient. This is largely because precipitation interferes with the laser sensor (lidar) on the roof of the car, which is an integral part of calculating the proximity of other vehicles and road obstacles.
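A sensible consequence of that recommendation is a fallback rule in the software itself. Here's a minimal sketch, assuming made-up thresholds (the 7.6 mm/hr figure is a common meteorological cutoff for "heavy rain"; the lidar return-rate metric and its threshold are invented for illustration): autonomy is disabled and the human driver takes over whenever conditions degrade the sensors too much.

```python
# Hypothetical weather-fallback rule; thresholds and metrics are illustrative only.

def autonomy_allowed(precipitation_mm_per_hr, lidar_return_rate, min_return_rate=0.8):
    """Heavy rain, or too few clean lidar returns, forces manual mode."""
    if precipitation_mm_per_hr > 7.6:      # common cutoff for "heavy rain"
        return False
    return lidar_return_rate >= min_return_rate

print(autonomy_allowed(1.0, 0.95))    # True: light rain, clean lidar returns
print(autonomy_allowed(12.0, 0.95))   # False: heavy rainfall, hand back control
print(autonomy_allowed(1.0, 0.50))    # False: lidar too degraded to trust
```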