

Can Autonomous Cars Drive Better than Humans?

  • Swetha Goud
  • Jun 13, 2019

 

Autonomous Car

 

Imagine getting in your car, typing or speaking a location into your vehicle’s interface, then letting it drive you to your destination while you read a book, surf the web or nap. Self-driving vehicles – the stuff of science fiction since the first roads were paved – are coming, and they’re going to radically change what it’s like to get from point A to point B.

In 2009, Google launched its self-driving project focusing on saving lives and serving people with disabilities. In a 2014 video, Google showed blind and elderly riders climbing into its custom-designed autonomous vehicles, part of the company’s plan to “improve road safety and help lots of people who can’t drive.”

Although there were several self-driving projects in the United States at the time, many developed by government agencies or university labs, Google’s project differentiated itself by being public-facing. The goal was not to build cars—although Google did build its own testing prototypes—but to create a self-driving service that would help regular people get around.

Google began testing its vehicles on public streets the very same year the project launched. With the reorganization of Google into its new parent company Alphabet, the self-driving program became its own entity, Waymo. Almost a decade later, Waymo remains the clear leader for safe self-driving miles on U.S. streets.

According to Waymo’s monthly reports, its vehicles have been in two dozen crashes, only one of which was the fault of the Waymo vehicle. In that 2016 crash, a Waymo vehicle bumped a bus while traveling at 2 miles per hour. On May 4, 2018, one of Waymo’s minivans was involved in a crash with minor injuries in Chandler, Arizona, while in autonomous mode, but police said Waymo’s van was not the “violator vehicle.”

 

Now there are dozens of autonomous vehicle companies testing on U.S. streets. As of February 2019, Waymo had logged five million self-driven miles, more than any other company. Over the next few months, Waymo’s fleet began driving about 25,000 self-driven miles per day, or roughly one million miles per month. In June 2019, Waymo hit another major milestone of eight million self-driven miles as its new electric Jaguar I-Pace vehicles hit the streets.

The next most experienced companies, Uber and GM Cruise, are still several million miles behind Waymo. That doesn’t include miles driven in the semi-autonomous modes that many cars now offer, like Tesla’s Autopilot, which are more driver-assistance systems than true self-driving vehicles.

In the last few years, the greatest strides in the self-driving industry have been made by ride-hailing companies, which are devoting exceptional amounts of time and money to developing their own proprietary technologies and, in many cases, giving members of the public rides in their vehicles.

In 2017, Lyft’s CEO predicted that within five years all of its vehicles would be autonomous. At a press conference in March 2018, where Waymo CEO John Krafcik announced the company’s ride-hailing program, Krafcik claimed that Waymo would be making at least one million trips per day by 2020.

 

 

Can autonomous cars drive better than humans?

The biggest safety advantage of an autonomous vehicle is that a robot is not a human—it is programmed to obey all the rules of the road, won’t speed, and can’t be distracted by a text message flickering onto a phone. And, hypothetically at least, AVs can also detect what humans can’t—especially at night or in low-light conditions—and react more quickly to avoid a collision.

 

AVs are laden with sensors and software that work together to build a complete picture of the road. One key technology for AVs is LIDAR, a “light detection and ranging” sensor. By firing millions of laser pulses, LIDAR draws a real-time, 3D image of the environment around the vehicle. In addition to LIDAR, radar sensors measure the size and speed of moving objects, and high-definition cameras can actually read signs and signals. As the car travels, it cross-references all this data with GPS technology that situates the vehicle within a city and helps plan its route.
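The way these sensors complement one another can be sketched in a few lines of Python. Everything below—the class names, the fields, the fusion rule—is a hypothetical illustration, not any vendor’s actual software: each sensor contributes the measurement it is best at, and a fusion step merges them into one tracked object.

```python
from dataclasses import dataclass

@dataclass
class LidarReading:
    distance_m: float    # precise 3D range to the object
    bearing_deg: float   # direction relative to the car

@dataclass
class RadarReading:
    speed_mps: float     # relative speed of the object (negative = approaching)

@dataclass
class CameraReading:
    label: str           # semantic identity, e.g. "stop_sign", "cyclist"

@dataclass
class TrackedObject:
    distance_m: float
    bearing_deg: float
    speed_mps: float
    label: str

def fuse(lidar: LidarReading, radar: RadarReading, camera: CameraReading) -> TrackedObject:
    """Combine each sensor's strongest measurement into one object track."""
    return TrackedObject(
        distance_m=lidar.distance_m,   # LIDAR excels at geometry
        bearing_deg=lidar.bearing_deg,
        speed_mps=radar.speed_mps,     # radar excels at velocity
        label=camera.label,            # cameras excel at recognition
    )

obj = fuse(LidarReading(42.0, 10.0), RadarReading(-3.5), CameraReading("cyclist"))
print(obj.label, obj.distance_m)  # prints: cyclist 42.0
```

Real systems fuse many overlapping, noisy readings with probabilistic filters rather than picking one sensor per field, but the division of labor is the same.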

In addition to the sensors and maps, AVs run software programs which make real-time decisions about how the car will navigate relative to other vehicles, humans, or objects in the road. Engineers can run the cars through simulations, but the software also needs to learn from actual driving situations. This is why real-world testing on public roads is so important.
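As a toy illustration of the kind of rule such software applies (the function name and thresholds here are invented for the example, not taken from any real AV stack), a single decision step might map the perceived situation to a driving action using time-to-collision:

```python
def decide(distance_to_obstacle_m: float, closing_speed_mps: float) -> str:
    """Pick an action from a simple time-to-collision rule.

    Time to collision = distance / closing speed; the 2 s and 6 s
    thresholds are arbitrary values chosen for illustration.
    """
    if closing_speed_mps <= 0:  # obstacle holding distance or moving away
        return "maintain_speed"
    ttc = distance_to_obstacle_m / closing_speed_mps
    if ttc < 2.0:
        return "emergency_brake"
    if ttc < 6.0:
        return "slow_down"
    return "maintain_speed"

print(decide(30.0, 20.0))  # 1.5 s to collision -> prints "emergency_brake"
```

A production planner weighs thousands of such considerations at once—and, as the article notes, the hard part is learning realistic values and edge cases from real-world driving, which no hand-written rule captures.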

But the way AV companies gather that information has raised concerns about how well autonomous vehicles can detect and avoid vulnerable road users, like cyclists, and people who move more slowly and erratically through streets, like seniors and children.

Self-driving companies also put their vehicles through endless tests on simulated city streets. Many traditional automakers use a facility called Mcity in Ann Arbor, Michigan, but the larger self-driving companies have built their own fake cities specifically to test interactions with humans who are not in vehicles. Waymo’s fake city, named Castle, even has a shed full of props—like tricycles—that might be used by people on streets, so that Waymo’s engineers can learn how to identify them.

 

Tesla’s Autopilot feature, one of many driver-assist features that allow control of the vehicle to switch between human and computer, can distract drivers or give them a false sense of security. (Image: The Verge)

Full autonomy is the official policy recommendation of the Self-Driving Coalition for Safer Streets, a lobbying group that wants cars to eventually phase out steering wheels and let the software take over 100 percent of the time, completely eliminating the potential for human error. General Motors is planning to make cars without steering wheels by 2022.

In 2018, Waymo began conducting fully autonomous testing in Arizona without a human safety driver. California now allows fully autonomous testing as well.

At least for the near future, even fully autonomous vehicles will still have to contend with the mistakes of human drivers. To truly make self-driving technology the safest it can be, all the vehicles on the road should be fully autonomous—not just programmed to obey the rules of the road, but also to communicate with each other.
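Vehicle-to-vehicle coordination like this is often described as cars broadcasting short status messages to their neighbors. The sketch below is purely illustrative—the message fields and merge logic are assumptions for this example, not any standard’s wire format (real deployments use standardized basic safety messages over DSRC or C-V2X radios):

```python
import json

def make_status_message(vehicle_id: str, lat: float, lon: float,
                        speed_mps: float, braking: bool) -> str:
    """Serialize a minimal, hypothetical V2V status broadcast."""
    return json.dumps({
        "id": vehicle_id,
        "lat": lat,
        "lon": lon,
        "speed_mps": speed_mps,
        "braking": braking,
    })

def any_neighbour_braking(messages: list[str]) -> bool:
    """A following car could pre-emptively slow if any neighbor reports braking."""
    return any(json.loads(m)["braking"] for m in messages)

msgs = [
    make_status_message("car-1", 33.30, -111.84, 14.0, False),
    make_status_message("car-2", 33.31, -111.85, 0.0, True),
]
print(any_neighbour_braking(msgs))  # prints: True
```

The point of the sketch is the principle in the paragraph above: a car that hears a neighbor’s broadcast can react to a hazard it cannot yet see, which no amount of onboard sensing provides.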

 

 

Author: Ravinder Joshi

About: Ravinder is pursuing a Ph.D. in Algorithms & Compilers after completing his Master’s in Computers. His expertise spans diverse subjects such as Machine Learning, Artificial Intelligence, Data Science, and Big Data & Hadoop, among various others. Ravinder, who is a Senior Software Architect, also has a zeal for teaching.

