Autonomous Cars Will Learn From Those They’ll Ultimately Replace – The Driver
A team of researchers from Cornell University has been developing a new system in which cameras focus on the driver to understand what they are doing and to help predict their actions. This will help autonomous vehicles “work out how they should behave, too,” an article in Gizmodo stated.
The team built a recurrent neural network that analyzes information from a number of data sources in order to predict what the driver will do. For example, it examines the car's speed, its GPS position, and the orientation of the driver's head to “assign a probability to whether or not the car will change lanes in the next few seconds.”
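The article does not describe the team's architecture in detail, but the idea of feeding a short window of driving features through a recurrent network to get a lane-change probability can be sketched roughly as follows. This is a minimal illustration with randomly initialized weights standing in for trained parameters; the feature names, window length, and network size are all assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

FEATURES = 4   # hypothetical inputs: speed, gps_lat, gps_lon, head_yaw
HIDDEN = 8     # size of the recurrent state (arbitrary for this sketch)

# Random weights stand in for parameters a real system would learn
# from recorded driving data.
W_xh = rng.normal(0, 0.1, (HIDDEN, FEATURES))  # input -> hidden
W_hh = rng.normal(0, 0.1, (HIDDEN, HIDDEN))    # hidden -> hidden (recurrence)
W_hy = rng.normal(0, 0.1, (1, HIDDEN))         # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lane_change_probability(window):
    """window: (timesteps, FEATURES) array of recent sensor readings.

    Unrolls a plain RNN over the window and squashes the final
    hidden state into a probability between 0 and 1.
    """
    h = np.zeros(HIDDEN)
    for x in window:
        h = np.tanh(W_xh @ x + W_hh @ h)  # recurrent state update
    return sigmoid(W_hy @ h).item()       # scalar probability

# A 3.5-second window sampled at 10 Hz would be 35 timesteps.
window = rng.normal(0, 1, (35, FEATURES))
p = lane_change_probability(window)
print(f"P(lane change) = {p:.3f}")
```

In a trained system the weights would be fit so that windows preceding an actual lane change score near 1 and other windows score near 0; the untrained sketch only shows the data flow.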
The team published a paper on arXiv and reported that the software “can anticipate maneuvers 3.5 seconds before they occur in realtime.” While impressive, the result rests on data from just 10 drivers who covered 1,000 miles of freeway and city driving over two months. Gizmodo author Jamie Condliffe argues that lane changes on freeways are probably among the easiest maneuvers to predict.
MIT professor of robotics John Leonard argues that cities such as Boston, with its frequent construction, detours, and confusing street configurations, could spell trouble for these vehicles. “I expected to be impressed, because the people involved are so good,” Leonard said in a Boston Globe article, referring to a ride he took in Google’s automated car while in San Francisco for a conference. But he points out that these vehicles will need to navigate the way humans do, and will need to be tested under a range of different and difficult driving conditions.