The 2015 Consumer Electronics Show has opened, and we begin our Day One coverage with a few introductions of advanced driver assistance systems (ADAS), integrated infotainment technologies and cloud-based data processing platforms worth watching during the four-day event.
NVIDIA introduced two in-car supercomputer processors with advanced capabilities for machine learning, visual computing and next-generation digital cockpit visualization. The DRIVE PX is an autonomous vehicle platform engineered to be more powerful than the world’s fastest supercomputer of just 15 years ago, while the DRIVE CX is the company’s latest digital cockpit computer, handling all infotainment and graphics processing.
The DRIVE PX platform uses two of NVIDIA’s new automotive-grade Tegra X1 chips to deliver over one teraflop of processing power and can handle up to 12 high-resolution camera inputs. By sensing and classifying objects and specific images, the platform enables a vehicle to drive itself and to learn subtle details of its surroundings much as a human does.
The on-board supercomputer studies objects, compares them to images transmitted by other cars, and identifies and archives them in the vehicle’s “library,” reducing the likelihood of encountering unknown objects. This is a significant advance over the current generation of processors, which struggle to distinguish inert objects from vehicles, pedestrians, animals and cyclists.
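The “library” idea described above can be sketched as a simple matching loop: a detected object’s features are compared against archived entries, and anything new is added so later encounters are recognized. This is an illustrative sketch only; the class, the cosine-similarity matching rule and the threshold are assumptions, not NVIDIA’s actual implementation.

```python
# Hypothetical sketch of a shared "object library": detections are
# matched against archived feature vectors; unknowns can be archived
# so future encounters are recognized. All names are illustrative.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class ObjectLibrary:
    """Archive of labeled feature vectors, e.g. shared between vehicles."""

    def __init__(self, threshold=0.9):
        self.entries = []        # list of (label, feature_vector) pairs
        self.threshold = threshold

    def classify(self, features):
        """Return the best-matching label, or None if nothing is close enough."""
        best_label, best_score = None, 0.0
        for label, archived in self.entries:
            score = cosine_similarity(features, archived)
            if score > best_score:
                best_label, best_score = label, score
        return best_label if best_score >= self.threshold else None

    def archive(self, label, features):
        """Add a newly identified object so it is recognized next time."""
        self.entries.append((label, features))

lib = ObjectLibrary()
lib.archive("cyclist", [0.9, 0.1, 0.3])
print(lib.classify([0.88, 0.12, 0.31]))  # near-identical vector -> "cyclist"
print(lib.classify([0.0, 1.0, 0.0]))     # dissimilar vector -> None
```

In practice the feature vectors would come from a neural network, but the lookup-and-archive cycle is the part that lets the fleet’s shared library shrink the set of unknown objects over time.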
The DRIVE CX system also uses NVIDIA’s Tegra processors to power up to 16.8 million pixels across multiple displays that replicate a vehicle’s instrument cluster. Its Surround-Vision feature provides a top-down, 360-degree view of the vehicle in real time, not only eliminating blind spots but potentially replacing conventional rear-view mirrors altogether.
Both processors are expected to be available by mid-2015.
QNX Software Systems, a subsidiary of BlackBerry Ltd., announced seamless integration of ADAS with a new infotainment platform.
The updated ADAS integrates data from multiple sensors, such as cameras, ultrasonic sensors and LiDAR, and utilizes an advanced digital instrument cluster. An innovative LED-based display allows drivers to intuitively gauge the direction and proximity of objects to the front, rear, and sides of the vehicle without taking their eyes off the road. Stretching the width of the dash, the display integrates input from the car’s ultrasonic and LiDAR sensors to provide a centralized view of ADAS warnings. The QNX concept team has also transformed the car’s rear- and side-view mirrors into video displays that offer a complete view of the scene behind and to the sides of the vehicle, with side-view displays providing color-coded alerts to warn of cars or other objects in the vehicle’s blind spots.
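The color-coded alert logic described above amounts to fusing each zone’s sensor readings into one distance and mapping that distance to a display color. The following is a minimal sketch of that idea, not QNX’s actual software; the zone names, thresholds, colors and the nearest-reading fusion rule are all assumptions for illustration.

```python
# Illustrative sketch of mapping fused ultrasonic/LiDAR distances to
# color-coded proximity alerts per zone. Thresholds and zone names
# are assumptions, not part of the QNX concept system.

def alert_color(distance_m):
    """Map an object's distance (meters) to a display color."""
    if distance_m < 1.0:
        return "red"      # imminent: object in the blind spot
    if distance_m < 3.0:
        return "amber"    # caution: object approaching
    return "green"        # clear

def fuse_zone_readings(readings):
    """Combine a zone's ultrasonic and LiDAR readings by taking the
    nearest reported distance (a conservative fusion rule)."""
    return min(readings.values())

zones = {
    "front":     {"ultrasonic": 4.2, "lidar": 4.5},
    "left-side": {"ultrasonic": 0.8, "lidar": 0.9},
    "rear":      {"ultrasonic": 2.1, "lidar": 2.4},
}

for zone, readings in zones.items():
    print(zone, alert_color(fuse_zone_readings(readings)))
# front green, left-side red, rear amber
```

Taking the minimum of the two sensors is the cautious choice: a false “too close” reading costs an unnecessary warning, while a missed one costs a collision.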
A new infotainment system based on the QNX CAR platform was also introduced, incorporating speech recognition and natural language processing via Nuance’s Dragon Drive VR engine. The system detects when the vehicle is moving out of range of an AM or FM broadcast signal and automatically switches to the corresponding iHeartRadio station, so listening continues in the app’s digital format.
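The broadcast-to-stream handoff just described can be sketched as a small source-selection function: while the tuned signal is usable, stay on broadcast; once it drops below a threshold, fall back to the matching digital stream. The threshold value, the station-to-stream mapping and the function name below are hypothetical, introduced only for illustration.

```python
# Hedged sketch of an AM/FM-to-stream handoff. The mapping entries,
# threshold, and URI scheme are illustrative assumptions only.

STREAM_FOR_STATION = {
    "98.5 FM": "iheart://streams/hypothetical-985",  # hypothetical mapping
}

def pick_audio_source(station, signal_strength_db, threshold_db=-90.0):
    """Return ("broadcast", station) while the signal is usable;
    otherwise the matching digital stream, if one is known."""
    if signal_strength_db >= threshold_db:
        return ("broadcast", station)
    stream = STREAM_FOR_STATION.get(station)
    return ("stream", stream) if stream else ("broadcast", station)

print(pick_audio_source("98.5 FM", -60.0))   # strong signal -> stay on broadcast
print(pick_audio_source("98.5 FM", -110.0))  # weak signal -> digital stream
```

A production system would add hysteresis so the source does not flap back and forth near the threshold, but the core decision is this simple comparison.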
Of special interest is an in-booth IoT demonstration of a cloud-based platform by parent company BlackBerry. Highlights include remote software troubleshooting and performance optimization, app management, over-the-air (OTA) software updates and smartphone control of various vehicle functions. A second demonstration, for fleet asset tracking, shows the locations of thousands of truck containers, their shipping history and views of specific trip segments.