Could Autonomous Vehicles Take Over Public Transportation?
Partnerships with Uber in Summit, N.J., and Altamonte Springs, Fla., suggest that using human Uber drivers to transport residents may be cheaper than building parking lots or running a bus system; replacing drivers’ personal vehicles with autonomous vehicles may be the next step.
The vision of an autonomous fleet for transportation-as-a-service is shared by many companies, including Northrop Grumman. Its ACUGOTA project, being conducted under the auspices of the Defense Advanced Research Projects Agency (DARPA), aims to develop an intelligent system to operate in airports and enable seamless integration of unmanned and manned systems within the air traffic management system. The intelligence to adapt to the fast pace and quickly changing environment of airport runways may someday influence how self-driving vehicles operate on the highway.
Path to Autonomy
Self-driving vehicles and buses won’t replace human drivers for years, perhaps decades, to come. Lyft co-founder John Zimmer posits three phases for autonomous vehicles on public roads:
- Traveling on fixed routes only
- Traveling everywhere at speeds less than 25 miles per hour
- Fully autonomous vehicles freely traveling
For one thing, driverless cars will need to exceed human performance when it comes to accidents. Google reported to the California Department of Motor Vehicles (DMV) that, within a one-year span, its self-driving cars experienced 272 failures and might have crashed as many as 13 times had their human pilots not intervened. Other companies have filed similar reports with the DMV, but their contents have not been made public.
While they’re better than human drivers in clear, daylight conditions on known roads, autonomous vehicles falter in less-than-ideal situations, such as:
- Fog, rain or snow
- Inadequate or missing road markings
- Deviations from mapped conditions, such as road repairs
Many Challenges, Many Approaches
Automakers and their suppliers are tackling these problems with a variety of technologies.
More/better sensors: LIDAR sensors, which bounce beams of light off the environment to measure the distance of objects, are key to giving autonomous vehicles a 360-degree view of their surroundings; they need to become smaller and less expensive before they’re practical for vehicles in commercial production.
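The ranging principle behind LIDAR is simple time-of-flight arithmetic: a pulse of light travels to an object and back, so the distance is half the round trip at the speed of light. A minimal sketch (the function name and sample timing are illustrative, not from any real sensor’s API):

```python
# Hypothetical sketch of LIDAR time-of-flight ranging:
# distance = (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after 200 nanoseconds indicates an object
# roughly 30 meters away.
print(round(lidar_distance(200e-9), 1))
```

The tiny time scales involved (nanoseconds per pulse, across many beams sweeping 360 degrees) hint at why shrinking these sensors while keeping them accurate is hard.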
HD mapping: The maps used in today’s car navigation systems provide a rudimentary plot of roadways. High-definition (HD) maps not only give self-driving cars route information; they also include the number of lanes, the positions of guard rails, road signs and driving rules. HD maps can also be updated in the cloud in close to real time by taking in data transmitted by other persistently connected vehicles.
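To make the contrast with ordinary navigation maps concrete, here is a sketch of the richer attributes an HD map segment might carry, plus a near-real-time update of the kind connected vehicles could feed back through the cloud. All field and function names are invented for illustration; real HD map formats differ:

```python
from dataclasses import dataclass, field

# Hypothetical HD map record: far richer than a basic route plot.
@dataclass
class HDMapSegment:
    segment_id: str
    lane_count: int
    speed_limit_mph: int
    has_guard_rail: bool
    road_signs: list = field(default_factory=list)
    version: int = 0  # bumped on each cloud update

def apply_cloud_update(segment: HDMapSegment, update: dict) -> HDMapSegment:
    """Merge an update reported by connected vehicles (e.g. a lane
    closed for road repairs) into the shared map."""
    for key, value in update.items():
        setattr(segment, key, value)
    segment.version += 1
    return segment

seg = HDMapSegment("I-80-E-1042", lane_count=3,
                   speed_limit_mph=65, has_guard_rail=True)
apply_cloud_update(seg, {"lane_count": 2})  # repairs close one lane
print(seg.lane_count, seg.version)
```

The versioned update step is what lets a fleet correct for deviations from mapped conditions, like the road repairs mentioned above, faster than a traditional map refresh cycle.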
More powerful chips: Cramming in enough processing power for a car to respond quickly to unforeseen events requires new kinds of chips to replace the bulky computers installed in the trunks of demo vehicles. Nvidia released Drive PX 2, a car computing platform it says combines artificial intelligence, deep learning and HD mapping. Mobileye’s EyeQ 4/5 is a system on a chip that handles processing of signals from sensors, real-time mapping and “world view generation,” that is, an accurate view of real-world conditions. And Northrop Grumman holds the Guinness World Record for creating the fastest integrated chip; it operates 1,000 times faster than a cellphone, according to Inside Aerospace.
Artificial intelligence: It takes weeks or months of practice for humans to become good drivers. Automakers want their autonomous vehicles to learn from experience as well, so that their systems can continually improve once the car is on the road. Artificial intelligence lets self-driving vehicles detect patterns and learn from the behavior of other cars on the road.
Deep learning: This artificial intelligence technique aims to replicate the human brain by creating artificial neural networks that can crunch through massive computational problems and make inferences based on past experience. Combining deep learning with multiple graphics processing units can enable a car to “see” and evaluate what it sees – better than a human driver.
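The “artificial neural networks” mentioned above are stacks of simple layers, each a linear transform followed by a nonlinearity. A minimal forward-pass sketch (the weights here are random placeholders; a real perception network learns them from driving data):

```python
import numpy as np

# Hypothetical sketch of the stacked layers deep learning builds on.
rng = np.random.default_rng(0)

def relu(x):
    # Nonlinearity: lets stacked layers model more than straight lines.
    return np.maximum(0.0, x)

def dense(x, weights, bias):
    # One fully connected layer: linear transform, then nonlinearity.
    return relu(x @ weights + bias)

# A toy 4-value sensor reading pushed through two stacked layers,
# ending in 2 output scores (e.g. "pedestrian" vs. "not pedestrian").
x = np.array([0.2, 0.9, 0.1, 0.5])
hidden = dense(x, rng.standard_normal((4, 8)), np.zeros(8))
scores = dense(hidden, rng.standard_normal((8, 2)), np.zeros(2))
print(scores.shape)
```

Each of these layer operations is a matrix multiplication, which is exactly the workload graphics processing units excel at, hence the pairing of deep learning with multiple GPUs described above.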
Aviation: Many technologies crucial to autonomous vehicles may have their roots in the aviation sector. For example, the U.S. Navy’s X-47B Unmanned Combat Air System Demonstration (UCAS-D), a program executed with prime contractor Northrop Grumman, recently succeeded in the first-ever launch and recovery of an unmanned aircraft from a carrier at sea.
Before autonomous vehicles can share the roads and take over from humans, there are also legal and regulatory issues to hash out. The U.S. Department of Transportation recently produced guidelines parceling out the responsibilities for regulating driverless cars among states and the federal government – but that still leaves individual states to figure out their policies.
There’s a similar debate about unmanned aerial systems flying in national airspace. The Federal Aviation Administration’s regulations haven’t caught up with advances in autonomy. FAA regulations require unmanned aerial vehicles to be piloted by a human, according to Lexology. Google and Amazon say they’re ready to provide drone delivery of goods, according to The Verge, but they’re hampered by current regulations.
Liability in the case of a crash of an autonomous vehicle of any kind is another issue to be addressed. Volvo has agreed to accept liability for its driverless cars, but how autonomy will affect individuals’ insurance policies is unclear.
Before autonomous vehicles can take over from traditional buses and taxis, they’ll need to be able to do everything a human bus driver can: differentiate between someone waiting for the bus and a pedestrian about to cross the street, safely travel through bustling city streets, avoid darting pedestrians and make some tough choices.
These technologies are strong steps in that direction, but humans will need to stay behind the wheel – whether or not they have their hands on it – for years to come.