Nvidia’s car of the future revealed at CES, with deep-learning auto-pilot and so many touchscreens…

Nvidia is using a brand new super-chip to power the car of the future, packed with more Full HD touchscreen displays than ever before, plus a slick new deep learning auto-pilot system.

At its huge CES 2015 event in Las Vegas, Nvidia revealed its vision of the car cockpit of the near future, which is jam-packed with high-definition screens giving real-time feedback along with voice activation, direct controls and personalised interior design.

To power all of this tech you’ll need a pretty beefy computer, and that’s where Nvidia comes in. Its new Drive CX digital cockpit computer, powered by the brand new Tegra X1 mobile processor, can run four Full HD displays all on its own. These screens can display all kinds of advanced driver assistance system (ADAS) info, including fully 3D parking assistants, rich 3D navigation, blind spot detection, lane change assistance and even self-parking.

These displays could even be used to replace your physical dials and feedback devices. And as they’re fully digital, you’ll be able to customise their look for a more personal feel to your car’s interior.

For instance, you slide into your car and a dashboard cam scans your face, then reconfigures your dashboard with a sleek aluminium finish. Your wife gets in and the dashboard switches to a wooden finish, just the way she likes it.

But Nvidia had more to reveal at CES 2015, in the form of its new deep learning auto-pilot system.

The Nvidia Drive PX auto-pilot car computer uses dual Tegra X1 processors, delivering 2.3 teraflops of total power and supporting a dozen different camera inputs for serious accuracy. But the meat of the Drive PX is its deep learning neural network, which acts as the brain of the system.

This auto-pilot can apparently recognise images better than most humans, so it can pick out a person, a sheep or whatever else may be standing in the road with impressive accuracy. It does this using a massive database of image fragments – for example, the neural network could pick out an Audi by recognising just a few of its component parts, from the tyres to the badge to the chav behind the wheel.
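To get a feel for the idea, here's a deliberately simplified toy sketch of part-based recognition. This is purely illustrative and nothing like Nvidia's actual deep learning system (which learns features from raw pixels rather than matching named parts): an object is identified when enough of its known component parts are spotted in the frame. The part lists and threshold here are made up for the example.

```python
# Hypothetical illustration of part-based recognition, NOT Nvidia's
# implementation: an object counts as recognised when a large enough
# fraction of its known component parts is detected.

KNOWN_PARTS = {
    "audi": {"tyre", "four-ring badge", "grille", "headlight", "wing mirror"},
    "sheep": {"wool", "hooves", "muzzle", "ears"},
}

def classify(detected_parts, threshold=0.5):
    """Return the label whose known parts best match the detections,
    or None if no label reaches the threshold."""
    best_label, best_score = None, 0.0
    for label, parts in KNOWN_PARTS.items():
        # Fraction of this object's known parts seen in the frame
        score = len(parts & detected_parts) / len(parts)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

# Three of five known Audi parts spotted -> recognised as "audi"
print(classify({"tyre", "four-ring badge", "grille"}))  # → audi
```

In a real convolutional network the "parts" are learned feature maps rather than a hand-written dictionary, but the voting intuition is similar: many partial matches accumulate into a confident classification.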

The Drive PX picks up hazards in real time

The Drive PX can even adapt to all kinds of situations. For instance, if it detects a school bus with its lights flashing just up ahead, the system knows to use caution as there may be kiddies about to run out in front of the car.

Those aren’t the only kinds of hazards the Drive PX can pick up, though. We saw a demo of the auto-pilot system identifying speed cameras, warning signs (e.g. queues ahead), appropriate traffic lights and so on.

And even better, if the Drive PX detects a cop car close behind it can warn you to turn on your lights, slow down or do anything else that might prevent an unpleasant encounter.

While all that hardly means we’ll be getting some shut-eye as our car drives us to work any time soon, it does mean we can expect much better hazard warnings and built-in safety features from future vehicles.
