Some Pennsylvania motorists may have heard about the March 2018 death of a pedestrian who was struck by a self-driving Uber car. In Tempe, Arizona, where the accident occurred, the police chief said that the car was likely not at fault, but a professor at Arizona State University says there is a flaw in how autonomous vehicles are being taught to drive. According to him, teaching them to drive like humans means they will make the same errors that human drivers make.
In this case, according to video footage of the accident, the pedestrian stepped directly into the path of the vehicle in an area with no crosswalk or traffic signal. The professor says the problem is that the car behaved the way a human driver would, assuming that because no obstacles appeared in its visual range, none existed. In his view, an autonomous vehicle should only travel at a speed that allows it to stop for anything that appears within the area it is actually able to detect.
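The professor's rule amounts to a simple kinematic check: the distance covered while reacting plus the braking distance must fit inside the sensor's detection range. The sketch below illustrates that idea; the function name and all numeric inputs (detection range, deceleration, reaction time) are illustrative assumptions, not values from the article or the professor's actual system.

```python
import math

def max_safe_speed(detection_range_m, decel_mps2, reaction_s):
    """Largest speed v (m/s) satisfying the stopping constraint:

        v * reaction_s + v**2 / (2 * decel_mps2) <= detection_range_m

    i.e. reaction distance plus braking distance must fit inside
    the detection range. Solved as a quadratic in v.
    (Hypothetical helper for illustration only.)
    """
    a_t = decel_mps2 * reaction_s
    return -a_t + math.sqrt(a_t ** 2 + 2 * decel_mps2 * detection_range_m)

# Example: a 40 m detection range with 5 m/s^2 braking and no
# reaction delay caps speed at sqrt(2 * 5 * 40) = 20 m/s (~45 mph).
print(max_safe_speed(40.0, 5.0, 0.0))   # → 20.0
```

A shorter detection range (fog, darkness, a sensor blind spot) shrinks the allowed speed, which is exactly the behavior the professor argues a cautious autonomous vehicle should exhibit.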
The professor is currently working on a system intended to guarantee that a car can begin braking within a millisecond of detecting an obstacle. He points out that while a fatality caused by a human driver is a tragedy, one caused by an autonomous vehicle could shut down the entire industry.
Many motor vehicle accidents are caused by human error, such as distracted or drowsy driving, and drivers who cause accidents that injure people may be liable for the victims' expenses. However, obtaining this compensation can be complicated. For example, some types of injuries, including whiplash and traumatic brain injury, may not produce symptoms immediately after the accident, and an insurance company might dispute their connection to the crash. An attorney may be helpful in seeking an appropriate settlement.