Tesla’s Autopilot

An innovative step Tesla took four years ago is now putting it well ahead of the autonomous-car competition. In 2012 the automaker introduced an around-the-clock cellular internet connection in its vehicles. As MIT Tech Review’s Tom Simonite has pointed out, this connectivity has proved to be exactly what Tesla needs to make its own self-driving cars smarter and safer.

Tesla can use this connection to gather data that guides its push toward self-driving cars. The connection allows Tesla to retrieve data from drivers, observing how humans react and behave on the road in a variety of situations: light traffic versus heavy traffic, or clear versus stormy conditions. This data is then used as a benchmark for its constantly improving self-driving technology.

But Tesla isn’t limited to pulling observational data. According to Sterling Anderson, director of Tesla’s Autopilot program, the company can also upload new software to cars on the road. This constant two-way exchange of data has put the automaker ahead of its primary competitors, like Google and (according to some rumors) Apple.

Eighteen months ago Tesla began installing new sensors in its vehicles, under the premise that they were part of a new auto-brake feature. But underneath that selling point was the fact that the sensors would also detect nearby objects on the road, letting engineers see exactly how drivers reacted to them on a day-to-day basis. Anderson boasts that Tesla receives one million miles of data every ten hours, and since the sensors became standard in 2014 the company has accrued 780 million miles’ worth of driver data.

This information only makes Tesla’s autonomous software better, and if the software performs well against the data, it is uploaded to the cars so the feedback loop can begin once more. When that software is then used in a self-steering vehicle, engineers can see exactly how well it holds up to the human standard. By leveraging a large data set from real-world drivers, Tesla looks to be able to iterate quickly on its self-driving software.
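The loop described here — score candidate software against logged human behavior, and push it to the fleet only when it measures up — can be sketched roughly as follows. Every name and threshold below is a hypothetical illustration for clarity, not Tesla’s actual pipeline:

```python
# Illustrative sketch of a "benchmark against human drivers" feedback
# loop (assumed structure, not Tesla's real system): candidate driving
# software is scored on logged situations against what the human driver
# actually did, and deployed only if agreement clears a threshold.

def agreement_rate(candidate, logged_trips):
    """Fraction of logged situations where the candidate's chosen
    action matches the human driver's recorded action."""
    matches = total = 0
    for trip in logged_trips:
        for situation, human_action in trip:
            if candidate(situation) == human_action:
                matches += 1
            total += 1
    return matches / total if total else 0.0

def maybe_deploy(candidate, logged_trips, threshold=0.99):
    """Upload the candidate to the fleet only if it holds up against
    the human benchmark; otherwise keep iterating on it."""
    return agreement_rate(candidate, logged_trips) >= threshold
```

In this toy framing, each "trip" is a list of (situation, human action) pairs pulled from the fleet’s logs, and the deployment decision is just a comparison against a fixed agreement threshold — the real evaluation would of course be far richer.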

Even though software updates are regularly installed on Tesla’s vehicles, drivers needn’t worry about Autopilot taking over unexpectedly while they are behind the wheel. But a major issue Tesla will need to address if and when full autopilot becomes available is human responsibility. Some drivers, expecting self-driving cars to function free of error, behave recklessly and find themselves unprepared to take over in case of a malfunction. Google, for example, ran tests in which employees were allowed to use self-driving cars. The results were troubling: one employee, while traveling at 65 miles per hour, actually plugged his phone into his laptop. Google has since completely restructured its program, but Tesla remains convinced that Autopilot can coexist with responsible human driving practices.