In the vast roadmap of advanced driving technology currently under development, ZF Friedrichshafen AG is one of the notable players. Best known for its transmission systems, ZF is developing an entire array of in-house, open source technology that can be retrofitted to existing vehicles or built into new ones right at the core. Its development process is now modular throughout, which forms the basis of making technology for the entire industry. The need for a unified platform is paramount, and ZF’s component-by-component development is proof of it.
With its software developed on an open source base, ZF showed us a modular view of all the components that would one day come together as cohesive ADAS (advanced driver assistance system) units. Broadly dividing the entire array of technology, we have two major classifications - mechanical and advanced driving.
The mechanical components from ZF include the mSTARS (modular Semi-Trailing Arm Rear Suspension) axle system with an integrated 150kW (201bhp) electric drive system, Active Kinematic Control (AKC) rear axle steering, electrically powered belt drive steering, Integrated Brake Control (IBC), Adaptable Haptic Control (AHC), reworked Active Roll Stabilisation (ARS), Continuous Damping Control (CDC), a haptic-enabled seat belt safety system, and Integrated Chassis Control (ICC).
Furthermore, ZF’s advanced driving components are centered around ZF ProAI, which the company claims is the world’s first market-ready artificial intelligence electronic control unit (ECU), built in association with Nvidia. The algorithms binding together the central ECU, all the individual microcontrollers, Astyx-sourced high frequency radars, Mobileye’s EyeQ-powered imaging equipment and the aforementioned mechanical elements are being developed in-house. The core algorithms here are open source, and ZF aims for uniform market adoption to help OEMs adapt to the new technology.
The ZF ProAI was first unveiled at CES 2017, at the same time as the company announced its partnership with Nvidia. ZF uses the Nvidia Drive PX 2 AI driving computing platform at the heart of its advanced vehicles, and it is from here that every adaptive and autonomous element will be controlled.
With the ProAI, ZF is also the first among the automotive industry’s Tier I suppliers to have an AI-powered ECU close to production-ready status. The ECU uses the AutoCruise configuration of the Nvidia Drive PX 2, which is powered by the Nvidia Parker SoC. This SoC combines a hexa-core CPU architecture and the Nvidia Pascal GPU with a 128-bit LPDDR4 memory interface, 4K60p video encoders/decoders, a Gigabit Ethernet MAC, and dedicated ISPs and I/Os to present a formidable in-car computer.
Employing this single-processor configuration of the Drive PX 2, the ZF ProAI uses deep learning algorithms and collates information from multiple cameras, radars, lidars and ultrasonic sensors to learn while on the go. It applies sensor fusion to bring all this information together, which the ICC then uses to instruct the other mechanical components - CDC, AHC, IBC and ARS (acronyms expanded above).
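To make the sensor fusion idea concrete, here is a minimal sketch of its simplest form: combining noisy range estimates from two different sensors by weighting each one by how reliable it is. This is an illustration of the general technique only - the function name, noise figures and sensor values are all invented, not ZF's actual implementation.

```python
def fuse_estimates(readings):
    """Fuse a list of (value, variance) pairs into one estimate.

    Each sensor's reading is weighted by the inverse of its noise
    variance, so the more reliable sensor dominates the result.
    """
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    value = sum(v * w for (v, _), w in zip(readings, weights)) / total
    fused_variance = 1.0 / total  # fused estimate is tighter than either input
    return value, fused_variance

# Toy example: the camera's distance estimate is noisier than the radar's.
camera = (52.0, 4.0)  # estimated distance to obstacle in metres, variance
radar = (50.0, 1.0)

distance, variance = fuse_estimates([camera, radar])
print(round(distance, 1), round(variance, 1))  # 50.4 0.8 - closer to the radar value
```

In a real stack this runs over full object tracks rather than single numbers, but the principle is the same: no single sensor is trusted outright.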
So, to simplify, the entire array of sensors around the vehicle picks up on-road information (road signs, speed warnings, obstacles in proximity, the speed and trajectory of surrounding vehicles, etc.) and feeds all of it to the central ECU - the ZF ProAI. It is here that the information is decoded by the onboard ISPs and decoders with the help of the processor cores. The algorithms then decide how the vehicle will be controlled and its immediate next course of action, following which the microcontroller residing in the chassis instructs the individual components. For instance, if a vehicle suddenly slows down in front or a pedestrian steps into the way, the car will automatically use the integrated brake control to reduce speed, and simultaneously, the steering system will gauge the safest path around while taking the surroundings into consideration, and steer the car away from the obstacle. Depending on whether or not it is necessary, the car can also bring itself to a halt.
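The decision step in that pipeline can be sketched in a few lines. This is a hypothetical simplification, not ZF's logic: the thresholds, deceleration figure and action names are all assumptions, chosen only to show how fused sensor data might map onto the brake/steer/stop choices described above.

```python
def decide_action(obstacle_distance_m, own_speed_ms, clear_lane_available):
    """Return the manoeuvre the central ECU would request of the chassis."""
    # Rough stopping distance at ~0.7 g deceleration, plus a safety margin.
    stopping_distance = own_speed_ms ** 2 / (2 * 7.0) + 5.0
    if obstacle_distance_m > stopping_distance:
        return "maintain"            # no intervention needed
    if clear_lane_available:
        return "brake_and_steer"     # brakes slow the car, steering evades
    return "emergency_stop"          # no safe path around: halt the vehicle

print(decide_action(100.0, 20.0, True))   # obstacle far away: maintain
print(decide_action(25.0, 20.0, True))    # brake_and_steer
print(decide_action(25.0, 20.0, False))   # no clear lane: emergency_stop
```

A production system would reason over probabilities and trajectories rather than fixed thresholds, but the structure - sense, decide, then command individual actuators - is the one the article describes.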
The ZF ProAI also reads information from an HD map to gauge the route ahead. Alongside the ADAS implications, ProAI can implement swarm intelligence by using the car’s onboard telematics to “talk” to other similarly equipped vehicles on the road. It can use data downloaded from cloud servers to gauge traffic, learn about road controllers and apply AI algorithms to learn regular routes and conditions, thereby instructing the individual elements in a car more effectively. This entire AI engine is slated for series production in 2018, and is already being road-tested in prototype form.
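A minimal sketch of the swarm idea, under stated assumptions: vehicles broadcast hazard reports, and each car merges what it hears, trusting only hazards confirmed by several independent witnesses. The message format and field names here are invented for illustration.

```python
from collections import defaultdict

def merge_reports(reports, min_confirmations=2):
    """Keep only hazards confirmed by several independent vehicles."""
    counts = defaultdict(int)
    for report in reports:
        counts[(report["road_segment"], report["hazard"])] += 1
    return sorted(key for key, n in counts.items() if n >= min_confirmations)

reports = [
    {"vehicle": "A", "road_segment": 12, "hazard": "stalled_car"},
    {"vehicle": "B", "road_segment": 12, "hazard": "stalled_car"},
    {"vehicle": "C", "road_segment": 40, "hazard": "ice"},  # only one witness
]
print(merge_reports(reports))  # [(12, 'stalled_car')]
```

Requiring multiple confirmations is one simple way a car could avoid reacting to a single faulty or malicious report from the cloud.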
The other mechanical elements
While all of these form the core of an advanced vehicle, a number of other advanced elements also contribute to its safety. These include side cameras that enable ‘Wrong Way Inhibition’, an interior camera to detect driver fatigue and distraction, the active control retractor, the active buckle lifter, and electromechanical roll control.
By reading road signs, the Wrong Way Inhibition system uses the central infotainment display and in-car audio to alert the driver as soon as the indicator for a wrong turn is switched on. If the driver does not stop, the algorithms take charge of the steering wheel and brakes and stop the car before it gets far into the wrongly entered one-way turn. If there is a genuine need to proceed, the intervention can be overridden by the driver.
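The escalation just described - warn first, intervene only if the driver does not react, and always honour an explicit override - can be captured as a small state decision. This is a toy model with assumed names and states, not ZF's actual control flow.

```python
def wrong_way_response(wrong_way_detected, driver_stopped, driver_override):
    """Pick the system's response to a detected wrong-way turn."""
    if not wrong_way_detected:
        return "none"
    if driver_override:
        return "none"          # driver confirms the manoeuvre is intentional
    if driver_stopped:
        return "warn_only"     # the audio/visual alert was enough
    return "brake_and_hold"    # system takes over steering and brakes

print(wrong_way_response(True, False, False))  # brake_and_hold
print(wrong_way_response(True, True, False))   # warn_only
print(wrong_way_response(True, False, True))   # none: driver overrode it
```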
The advanced driver fatigue and distraction detection system uses an always-on camera that performs facial deconstruction to detect moods, gestures and the level of attention the driver appears to be paying. While we did not get to really ‘see’ this technology in action, ZF says that it can detect distraction and alert a driver within seconds, thereby avoiding potential accidents. The active control retractor works in seamless integration with this: if the driver seems incapable of driving, the car’s processor detects it on its own and hands control over to the algorithms. If the road ahead is not navigable, the car will move to a safe area on the road, slow down and come to a halt. Thanks to connectivity, this information is also fed to the servers, so any potential threat situation can be suitably flagged.
The active buckle lifter is essentially ZF’s adaptive seat belt system, already in production. The seat belt buckle lifts automatically as you sit down; you then strap on the belt, and it adapts automatically to the seat position and your sitting style. It also provides haptic feedback to confirm that you are seated safely. In case of urgent manoeuvring, the active buckle and adaptable haptic control automatically tighten the seat belt to prevent injury. The experience is a bit jarring at first, and the technology may still need refinement to avoid adjusting the belt with abrupt jerks.
Electromechanical roll control uses an electronic microcontroller to judge the vehicle’s trajectory along all three axes and inform the chassis control of the required balancing correction, after which the car holds the body steady even through fast turns.
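The "balancing requirement" here comes down to basic physics: lateral acceleration in a turn produces a roll moment about the body that the active stabiliser must counter. Here is a back-of-the-envelope sketch; the vehicle parameters are invented for illustration and say nothing about ZF's actual hardware.

```python
def anti_roll_moment(mass_kg, cg_height_m, speed_ms, turn_radius_m):
    """Moment (N*m) the stabiliser must supply to hold the body flat.

    Lateral acceleration a = v^2 / r acts at the centre of gravity,
    producing a roll moment of m * a * h about the roll axis.
    """
    lateral_accel = speed_ms ** 2 / turn_radius_m
    return mass_kg * lateral_accel * cg_height_m

# 1,800 kg car, CG 0.55 m high, 25 m/s (90 km/h) through a 100 m radius bend.
print(anti_roll_moment(1800, 0.55, 25.0, 100.0))  # 6187.5 N*m
```

The controller's job is to apply this counter-moment continuously and fast enough that occupants never feel the body lean.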
What does all of this mean?
Despite all this wizardry, the package is only good for achieving Level 3 autonomous driving as defined by the Society of Automotive Engineers. The technology at work here is mostly assistive in nature, and while it operates with a considerable degree of autonomy, it still requires human intervention in certain key situations.
It is this level that most cars will be reaching by 2020. Level 4, as the classification reads, requires a human to be mostly a passenger apart from exceptional circumstances, which would demand more computing power and more advanced sensors. As far as present innovation goes, ZF is making steady progress to hold its status as a prime automotive supplier.