Experts estimate that the utopian self-driving car is at least a decade away from being a common street sight. Still, automobile companies are putting the pedal to the metal for the day when complex logistics and stricter regulation come together.
Last year, Intel and the research firm Strategy Analytics released a report claiming that driverless vehicles will be behind $7 trillion worth of economic activity and new efficiencies annually by 2050. That activity, according to the report, will include nearly $4 trillion from driverless ride-hailing and nearly $3 trillion from driverless delivery and business logistics.
There’s a long way to go before autonomous vehicles become as common as the black-and-yellow cab. Regulation, cognitive technology, navigation and infrastructure will have to come together seamlessly so we aren’t facing incidents like last week’s accident. But more than a few companies are racing to perfect the self-driving car.
Here’s our pick of the most exciting autonomous vehicles to come:
Alphabet’s self-driving car outfit Waymo recently submitted a 41-page safety protocol to the California DMV. The document is part of a larger application to fully test driverless vehicles, without a person in the car, in California. In April this year, new regulations took effect in the state, allowing the DMV to issue permits for companies to conduct driverless tests.
This comes after news that Waymo has been doing extensive on-road testing by loaning out cars to beta testers and letting them travel without anyone behind the wheel. Speaking at the Google I/O developer conference, CEO John Krafcik told attendees that the free robo-taxi service would go live this year. In the meantime, expect the Waymo rollout to be limited to states like Arizona and Nevada.
China’s search giant Baidu is relatively new to the world of autonomous vehicles, but it is very familiar with AI technologies. Its self-driving platform, Apollo 1.0, was released in July 2017, and Baidu started testing Apollo-running cars on public roads late that year. That’s quick progress compared to its European competitors.
It’s a refreshing attitude to see. Jingao Wang, senior director of Baidu’s intelligent driving group and head of Apollo, said in a media interview that he is taking the open-source approach because, in his view, autonomous vehicles are simply an AI-based technology that needs a huge amount of data to thrive.
Good vibes from Baidu for sure, but let’s see whether that holds true once Tencent and Alibaba, who have also entered the playing field, make their mark in the Asian market.
The self-driving startup nuTonomy is owned by Aptiv Plc. There isn’t much news here, but there is a tremendous amount of hope, and its nuCore technology, a scalable full-stack software solution for automated driving, is what has people most excited. nuTonomy’s nuCore™ software is a flexible, modular system for perception, mapping, localization, motion planning, decision-making and control of passenger vehicles operating in complex urban environments. Its powerful planning engine enables human-like maneuvering, and its patented approach to decision-making allows autonomous vehicles to handle even the most complex traffic scenarios. nuCore can be integrated with a wide range of vehicle types, sensor configurations and ride-hailing applications, and has been successfully deployed on five different vehicle models across three continents.
BYTON, an innovator of smart, premium electric vehicles, announced a partnership with Aurora, a leading self-driving technology company. In the next two years, BYTON and Aurora will jointly conduct pilot deployment of Aurora’s L4 autonomous driving systems on BYTON vehicles. Additionally, BYTON and Aurora will explore the use of Aurora’s self-driving system in BYTON’s series production vehicles. According to the Society of Automotive Engineers, L4 autonomous vehicles can drive independently in most environments, with the expectation that humans may need or choose to drive in some conditions.
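The SAE taxonomy referenced above runs from L0 (no automation) to L5 (full automation), and the key distinction at L4 is that the system, not the human, is the fallback within its operating domain. A minimal sketch of that taxonomy (a paraphrase for illustration, not the official SAE J3016 text):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, paraphrased for illustration."""
    NO_AUTOMATION = 0       # human does all the driving
    DRIVER_ASSISTANCE = 1   # one assist feature, e.g. adaptive cruise control
    PARTIAL = 2             # combined assist; driver must supervise constantly
    CONDITIONAL = 3         # system drives, human must take over on request
    HIGH = 4                # drives itself in most environments (e.g. Aurora's L4)
    FULL = 5                # drives itself everywhere, in all conditions

def human_fallback_required(level: SAELevel) -> bool:
    # At L3 and below the human remains the fallback driver;
    # at L4 and L5 the system handles its own fallback.
    return level <= SAELevel.CONDITIONAL
```

Under this framing, the BYTON/Aurora pilot sits at the first level where no human takeover is required inside the system’s design domain.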
The big news from German supplier ZF has been its collaboration with Deutsche Post DHL Group (DPDHL) to deploy driverless electric light trucks that will transport and deliver packages from a central point to their destinations. Along the way, the truck is trained to accurately assess its environment for variables such as traffic conditions, parking-spot identification and parking, and pedestrian behavior. It is powered by the ZF ProAI self-driving system, built around the Nvidia DRIVE PX palm-size supercomputer, and includes sensors, cameras, LIDAR and radar that feed data into the system.
Driver monitoring is therefore a critical part of the development process – and one where ZF has a solid understanding. At CES, where the world got its first glimpse of their offering, the company demonstrated the multi-faceted interaction between human and machine, with innovative concepts in the area of human-machine interface (HMI).
It’s almost impossible to talk about the future of transport without mentioning Elon Musk. Musk got some flak for defending the Autopilot feature of Tesla’s cars after it was recently involved in an accident. He took to Twitter to argue against what he considered an unfair focus on mishaps rather than on the benefits of autonomous vehicles, which have the potential to make roads safer.
“It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the (approximately) 40,000 people who died in US auto accidents alone in past year get almost no coverage,” Musk said in a tweet.
Tesla’s Autopilot relies on eight surround cameras that provide 360 degrees of visibility around the car at up to 250 meters of range. Twelve updated ultrasonic sensors complement this vision, allowing detection of both hard and soft objects at nearly twice the distance of the prior system. A forward-facing radar with enhanced processing provides additional data about the world on a redundant wavelength that can see through heavy rain, fog, dust and even the car ahead.
Processing power is up 40x. To make sense of all of this data, a new onboard computer with over 40 times the computing power of the previous generation runs the new Tesla-developed neural net for vision, sonar and radar processing. Together, this system provides a view of the world that a driver alone cannot access, seeing in every direction simultaneously and on wavelengths that go far beyond the human senses.
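The payoff of such a redundant sensor suite is that overlapping, noisy measurements of the same obstacle can be combined into one estimate more reliable than any single sensor. A toy illustration of that idea (this is generic inverse-variance weighting, not Tesla’s actual software):

```python
# Toy sketch: fusing redundant range estimates of one obstacle from
# different sensors (e.g. radar and camera). Each sensor reports a
# distance in meters plus a variance reflecting its noise level;
# inverse-variance weighting pulls the fused estimate toward the
# least-noisy sensor -- the basic benefit of sensor redundancy.

def fuse_ranges(readings):
    """readings: list of (distance_m, variance) tuples from different sensors."""
    weights = [1.0 / var for _, var in readings]
    fused = sum(w * d for (d, _), w in zip(readings, weights)) / sum(weights)
    return fused

# Hypothetical values: radar reports 50.0 m with low noise,
# camera reports 52.0 m with higher noise.
print(fuse_ranges([(50.0, 1.0), (52.0, 4.0)]))  # lands close to 50, near the radar
```

The same weighting logic extends to sensors that fail differently: when fog blinds the cameras, their variance rises and the radar’s reading dominates automatically.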
Groupe Renault and Sanef launched a pilot project in June 2016 to study the behavior of autonomous Renault vehicles at toll barriers and in work zones. This experiment was conducted in Normandy on the A13 highway using the V2X connected infrastructure developed by Sanef.
Groupe Renault currently offers advanced driver-assistance systems (ADAS) on its vehicles. These systems improve safety and act for the most part without human input, as is the case with automatic emergency braking (AEBS). They serve as a gateway to autonomous vehicles, even though they are initially only there to assist the driver, who remains in charge of the vehicle. With autonomous drive operative, the car “sees” the road and monitors what is happening all around it by means of an extensive array of sensors. The data from the sensors is processed by multiple onboard “brains” in the form of embedded software that tells the car what to do, enabling the driver to safely hand the car over to its automatic driver in Eyes-off/Hands-off mode.
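To see why features like AEBS can act without human input, consider the kind of rule such a system is commonly built on: time-to-collision, the gap to an obstacle divided by the closing speed. The sketch below is an assumed, simplified illustration of that general principle, not Renault’s actual AEBS logic; the 1.5 s threshold is a made-up value:

```python
# Illustrative sketch of a generic automatic-emergency-braking rule
# (assumed logic, not Renault's implementation). The system brakes
# without driver input when the time-to-collision (TTC) -- distance to
# the obstacle divided by the speed at which the gap is closing --
# falls below a safety threshold.

def should_auto_brake(gap_m: float, closing_speed_ms: float,
                      ttc_threshold_s: float = 1.5) -> bool:
    if closing_speed_ms <= 0:        # gap is steady or growing: no threat
        return False
    ttc = gap_m / closing_speed_ms   # seconds until impact at current speeds
    return ttc < ttc_threshold_s

print(should_auto_brake(20.0, 15.0))  # ~1.33 s to impact -> True, brake now
print(should_auto_brake(60.0, 15.0))  # 4.0 s to impact -> False, just warn
```

Production systems layer far more on top (sensor confidence, driver warnings, partial braking stages), but the driver-out-of-the-loop trigger is the gateway idea the paragraph above describes.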
Last week, General Motors announced that it would spend $100 million to begin making production versions of its self-driving electric Chevy Bolt at two of its manufacturing facilities in Michigan. These cars are being built without a steering wheel, which sounds more than a little scary and leaves us all wondering what happens if human intervention is required.
The cars, which GM dubs the “Cruise AV,” will be the automaker’s first production-ready vehicle built from the ground up to operate with no steering wheel, pedals, or manual controls.
There was big news from Peugeot maker PSA Group last week when it announced a collaboration with AImotive, a startup focused on developing autonomous-vehicle technology, to test the system on highways in France. The pilot projects will involve self-driving at speeds of up to 80 mph and will use AImotive sensor and compute hardware, along with its autonomous software, installed on a Citroën C4 Picasso test car.
AImotive, a Hungary-based self-driving startup, offers “full stack” AI-based autonomous driving software, and recently expanded to the U.S. with a new Mountain View office. The company’s goal is to produce a scalable self-driving solution that automakers can adopt regardless of their particular hardware stack and vehicle designs.
The Group’s autonomous car will be built around three concepts. Easy-to-use technology for all: to accompany the driver with self-driving functions, Groupe PSA designs interfaces that are simple and intuitive, so that drivers can interact comfortably with their cars and take the wheel at any time. An offer for all: different degrees of autonomy to respond to customers’ different expectations. A programme for all: deployed across Groupe PSA’s three brands, Peugeot, Citroën and DS.
Last week, Chinese innovation company Didi Chuxing was given the go-ahead to start testing self-driving cars in California. The move came just after Uber was made to suspend its driverless car programme across North America, after March’s fatal collision with a pedestrian in Tempe, Arizona.
Didi is now the 53rd company to receive a permit to test autonomous vehicles in California, under the state’s Department of Motor Vehicles regulations. Didi opened its first Silicon Valley offices last year, focusing on artificial intelligence and security. The research facility in Mountain View, close to Google’s headquarters, now has about 100 staff.
Continental has been working with Nvidia, an equally exciting company to talk about, with the intent to build a full-scale, top-to-bottom autonomous driving system. They hope to bring their offering to market by 2021.
The Continental AV systems will be built by dedicated engineering teams from both sides, the companies announced. The base technologies underlying the Continental offering include Nvidia’s DRIVE Xavier system-on-a-chip, DRIVE OS and its DRIVE AV software, while Continental will supply ASIL-D safety certification expertise as well as radar, camera and LiDAR sensor solutions.