Enhancing road sensing beyond visual with smart data processing




For the last few years, the automotive industry has been grappling with the fact that inclement weather conditions pose a significant challenge for self-driving cars. To combat this issue, academic researchers and automotive companies have been working on ways to enable self-driving cars to safely navigate bad weather.

However, most, if not all, of these solutions have been based on visual technologies – such as equipping vehicles with new kinds of radar, lidar or cameras. Tactile Mobility has developed an AI-based software stack that enables both ordinary and self-driving vehicles to turn data from non-visual sensors into information about the road surface and the vehicle itself, so they can respond appropriately to road conditions. The BMW Group is currently implementing the technology in its next-generation vehicles, while OEMs such as Porsche plan to implement it in the near future.

Eitan Grosbard

In an interview with embedded.com, Eitan Grosbard, VP of business development at Tactile Mobility, said that the automotive industry is doing everything in its power to ensure driver safety, and that this can only be fully accomplished by going beyond visual sensing to include tactile sensing and data.

The company’s software collects data using existing sensors embedded in vehicles and then analyzes it to create actionable insights in real time, providing an accurate description of the state of the road, the vehicle, and the vehicle-road dynamics. “Our data services are based on the Tactile Mobility technology platform, which includes two advanced software modules – vehicle-embedded software and a cloud module – which can be delivered as standalone solutions or combined to reinforce each other,” Grosbard commented.

Autonomous vehicles and big data

The real economic potential of self-driving mobility lies in data collection and management, analysis of driving style, fuel consumption, and predictive maintenance. Today, in-vehicle telematics has become a valuable ally for large companies as well as for the car fleets of small businesses. The gradual spread of black boxes, increasingly installed by manufacturers or long-term rental companies, is contributing to the creation of “big data” from vehicles. If analyzed correctly and in advance, this data enables improved services and solutions tailored to a fleet or a driver.

Self-driving vehicles require continuous monitoring of the conditions of the on-board systems and the environment in which they are located: they are equipped with highly sophisticated sensors (radar, lidar, cameras, and so on) that constantly collect information to perform autonomous operations. From better grip estimation to shorter braking distances or from optimizing adaptive control to active suspension management systems, new automotive sensor technology is transforming a vehicle’s scope for the better.

This requires a mass of computational systems, both hardware and software, that can process large amounts of data coming from the sensors and enable the training of artificial intelligence (AI) and deep-learning algorithms in real time, directly in the vehicle. The safety and user-experience benefits have improved in recent years thanks in part to the computational capabilities of the latest generation of ECUs. Tactile Mobility is producing a signal-processing capability that eliminates noise and exposes hidden data, optimizing vehicle guidance.
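As a simple illustration of how filtering can expose a signal hidden in sensor noise, the following Python sketch smooths a noisy wheel-speed trace with an exponential moving average. This is a generic technique shown for illustration only – not Tactile Mobility’s proprietary processing – and the sample values are invented.

```python
# Illustrative sketch: smoothing a noisy wheel-speed signal with a simple
# exponential moving average (EMA) low-pass filter. This is NOT Tactile
# Mobility's actual algorithm, just an example of how filtering can
# expose a trend hidden in sensor noise.

def ema_filter(samples, alpha=0.2):
    """Return the exponentially smoothed version of `samples`.

    Smaller alpha -> heavier smoothing (more noise rejection,
    slower response to real changes).
    """
    smoothed = []
    state = samples[0]  # initialize filter state with the first reading
    for x in samples:
        state = alpha * x + (1 - alpha) * state
        smoothed.append(state)
    return smoothed

# Hypothetical noisy wheel-speed readings (km/h) around a true value of ~50
raw = [50.4, 49.1, 51.2, 48.7, 50.9, 49.5, 50.2, 49.8]
print(ema_filter(raw))
```

The smoothed trace stays within the range of the raw readings while damping the sample-to-sample jitter, which is the kind of cleanup needed before a signal can feed higher-level estimators.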

Tactile Mobility: signal processing of virtual sensors

Tactile Mobility’s technology is continuously updated and consists not only of the software module integrated in the ECU of various vehicles, but also of a cloud that processes the data coming from these vehicles. This software captures data from multiple non-visual sensors, enriching it with additional data from weather conditions. Uploaded to the cloud, these data sets are processed using machine learning and big data methodologies.

Figure 1: From data to powerful insights (Image: Tactile Mobility)

“We collect data from sensors. So what we’re doing is signal processing of virtual sensors. For example, one of the things that we create is grip estimation: slipperiness detection or the way that the vehicle holds the road. Our advanced signal processing and data normalization methodologies address the challenges of collection and representation. Measurements eliminate the effects of vehicle type variety and specific model configuration, producing information that is agnostic to a particular vehicle traveling on a specific roadway,” said Grosbard.

He continued, “We’re currently working with six vehicle manufacturers, including BMW, Porsche, and others under NDAs. Our processor collects data from existing in-vehicle sensors – with no additional hardware required – such as the wheel speed of all four wheels, engine torque, RPM and so on.”
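One common building block of grip estimation from exactly these inputs is the longitudinal slip ratio, which compares each wheel’s speed against a vehicle reference speed. The sketch below is a hypothetical illustration with assumed values and function names, not Tactile Mobility’s actual method.

```python
# Hypothetical sketch of a slip-ratio calculation, one common input to
# grip estimation. All values and thresholds are invented for illustration.

def slip_ratio(wheel_speed, vehicle_speed):
    """Longitudinal slip ratio: 0 means pure rolling; values well above
    0 indicate a wheel spinning faster than the vehicle is moving,
    which on a driven axle can signal loss of grip."""
    if vehicle_speed <= 0:
        return 0.0
    return (wheel_speed - vehicle_speed) / vehicle_speed

# Assumed wheel speeds (m/s) for all four wheels; the rear-left wheel
# is turning noticeably faster than the others.
wheels = {"FL": 20.1, "FR": 20.0, "RL": 23.5, "RR": 20.2}
vehicle_speed = 20.0  # reference speed, e.g. from non-driven wheels

ratios = {pos: slip_ratio(v, vehicle_speed) for pos, v in wheels.items()}
# Flag wheels whose slip magnitude exceeds an illustrative 10% threshold
low_grip = [pos for pos, r in ratios.items() if abs(r) > 0.1]
print(ratios, low_grip)
```

In practice such a raw ratio would be fused with engine torque, RPM, and other signals, then normalized across vehicle types, before it could serve as the vehicle-agnostic grip estimate Grosbard describes.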

Data on each vehicle’s engine efficiency, braking efficiency, tire health, weight, fuel consumption and more are identified as VehicleDNA. Pattern data relative to the road – such as slopes, curvature, normalized grip levels and the location of hazards such as bumps, cracks and potholes – are identified as SurfaceDNA. This data is then downloaded to vehicles driving in the specific area, improving safety and driver reaction time.
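To make the two concepts concrete, the following sketch models VehicleDNA and SurfaceDNA records as simple Python dataclasses. The field names and units are assumptions derived from the attributes listed above, not Tactile Mobility’s actual schema.

```python
# Illustrative data model for VehicleDNA and SurfaceDNA records.
# Field names, units, and thresholds are assumptions for illustration only.
from dataclasses import dataclass, field

@dataclass
class VehicleDNA:
    """Per-vehicle profile built from onboard measurements."""
    engine_efficiency: float    # relative, 0..1
    braking_efficiency: float   # relative, 0..1
    tire_health: float          # relative, 0..1
    weight_kg: float
    fuel_consumption_l_100km: float

@dataclass
class SurfaceDNA:
    """Per-road-segment profile aggregated in the cloud."""
    slope_pct: float
    curvature: float            # 1/m
    grip_level: float           # normalized, 0..1
    hazards: list = field(default_factory=list)  # e.g. "pothole", "bump"

segment = SurfaceDNA(slope_pct=2.5, curvature=0.01, grip_level=0.6,
                     hazards=["pothole"])
# A low-grip, hazardous segment could be flagged and pushed to other
# vehicles approaching the same stretch of road.
needs_warning = segment.grip_level < 0.7 or bool(segment.hazards)
print(needs_warning)
```

Separating the vehicle profile from the road profile is what lets road insights collected by one vehicle be reused by any other vehicle traveling the same segment.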

Tactile Mobility’s software technology will be embedded in the BMW Group’s next-generation vehicles on a global scale beginning in 2021, to help improve the driving experience and identify road conditions.








Original article: Enhancing road sensing beyond visual with smart data processing
Author: Nitin Dahad