CES 2017

So the stakes are high, and many companies are already playing hard. There were important announcements from two of the most established chip suppliers in the auto sector, NXP and Nvidia – the former particularly closely watched because it will soon be part of Qualcomm.
Its launches reminded the industry how strategic an acquisition it will be, bringing significant depth and breadth to its new parent’s auto offering, surrounding the core Snapdragon modem/processor with many other car-related chips, and adding valuable connections with the car industry and its channels.
NXP adds to Qualcomm’s dowry
The main CES launch by NXP was a software defined radio (SDR) solution for IVI systems called the SAF4000. This claims to be the world’s first one-chip system covering all global audio broadcast standards, including AM/FM, DAB+, DRM(+) and HD. The primary benefit is to slash hardware costs, as car radio systems generally require bulky circuit boards and six separate integrated circuits – whereas the SAF4000 is the size of a fingertip. Other companies provide a single RF front end for all these standards, but this is the first to deliver an entire system-on-chip (SoC), rather than offloading some of the processing to a device CPU.
This is part of NXP’s wider push to simplify the connected car device ecosystem by replacing multichip systems with compact RFCMOS chips, which will drive down costs and encourage mass uptake. That will complement Qualcomm’s existing presence in some Android Auto IVI systems, as well as its constant drive towards greater integration.
“The step from multichip discrete to a one-chip RFCMOS solution is a true quantum leap for the industry,” said Torsten Lehmann, general manager Car Infotainment and Driver Assistance at NXP. “One-chip means a significantly smaller hardware footprint, easier development and simple worldwide integration with the ability to receive multiple different digital, as well as analog, radio standards and to switch between them via software. Centralized infotainment head units, distributed radio/audio systems, as well as smart antenna solutions are perfectly supported by this new one-chip family.”
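The appeal of a one-chip SDR is that switching broadcast standards becomes a software decision rather than a hardware change. The sketch below is purely illustrative – the demodulators are stubs and none of this reflects NXP's actual firmware – but it shows the dispatch idea: one digitized RF stream, routed to whichever decoder the head unit selects.

```python
# Hypothetical sketch of software-defined radio standard switching.
# Standard names mirror those in the article; the demodulators are stubs.

from typing import Callable, Dict, List

def demod_fm(samples: List[float]) -> str:
    return "audio:fm"        # placeholder for FM demodulation

def demod_dab_plus(samples: List[float]) -> str:
    return "audio:dab+"      # placeholder for DAB+ decoding

def demod_hd_radio(samples: List[float]) -> str:
    return "audio:hd"        # placeholder for HD Radio decoding

# One chip, many standards: selection is a software lookup, not a board swap.
DEMODULATORS: Dict[str, Callable[[List[float]], str]] = {
    "FM": demod_fm,
    "DAB+": demod_dab_plus,
    "HD": demod_hd_radio,
}

def tune(standard: str, samples: List[float]) -> str:
    """Route the shared digitized RF stream to the selected demodulator."""
    try:
        return DEMODULATORS[standard](samples)
    except KeyError:
        raise ValueError(f"unsupported broadcast standard: {standard}")

print(tune("DAB+", [0.0, 0.1]))
```

Crossing a regional border is then just a different dictionary key, which is the "simple worldwide integration" Lehmann describes.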
Qualcomm’s $47bn acquisition of NXP Semiconductors, once it closes, will push the US firm into lead place in the auto chip market and third place in the overall global semiconductor industry. NXP says its radio designs are in 19 of the top 20 Tier 1 automotive offerings, adding up to over 500m car radio deployments in the past 10 years. It also claims that its i.MX multimedia app processors have been deployed in over 65m vehicles.
Qualcomm launches Drive Data
Qualcomm itself, which unveiled an automotive variant of its Snapdragon SoC (the 820A) at CES 2016, followed up this year with new X16 LTE modems for next-gen auto applications; a customer win in Volkswagen IVI systems for the updated Snapdragon 820Am; and a partnership with Panasonic to develop a next-gen Android-based IVI system for vehicles, using the Snapdragon 820Am running Android 7.0. The Panasonic concept was on display at CES.
More important to the US chip giant was the launch of the Drive Data Platform, which is Qualcomm’s brand for its sensor-fusion platform for collating and acting on all the sensor inputs that are generated by a vehicle, ranging from cameras to radar and lidar.
The Drive Data Platform is based on the Snapdragon 820Am, and will make extensive use of the Snapdragon Neural Processing Engine (SNPE), to use machine learning and AI techniques to enhance its data processing abilities – allowing the car to identify what it sees via those sensors. More details are due to be announced after CES, but this sounds like the first step down a very significant road for Qualcomm.
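Sensor fusion of the kind Drive Data targets means reconciling detections of the same object from cameras, radar and lidar into one estimate. The toy sketch below – a 1-D illustration with made-up numbers, in no way Qualcomm's implementation – shows the basic pattern of gating nearby detections together and weighting by confidence.

```python
# Toy illustration of sensor fusion: merge detections of the same object
# from different sensors into one estimate. Not Qualcomm's implementation.

from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    sensor: str      # "camera", "radar" or "lidar"
    x: float         # longitudinal position, metres
    confidence: float

def fuse(detections: List[Detection], gate: float = 2.0) -> List[float]:
    """Cluster detections within `gate` metres of each other and return
    confidence-weighted position estimates, one per cluster."""
    clusters: List[List[Detection]] = []
    for det in sorted(detections, key=lambda d: d.x):
        if clusters and det.x - clusters[-1][-1].x <= gate:
            clusters[-1].append(det)        # same physical object
        else:
            clusters.append([det])          # new object
    return [
        sum(d.x * d.confidence for d in c) / sum(d.confidence for d in c)
        for c in clusters
    ]

obstacles = fuse([
    Detection("camera", 24.8, 0.6),
    Detection("radar",  25.2, 0.9),   # same car as the camera sees
    Detection("lidar",  60.1, 0.8),   # a second object further ahead
])
print(obstacles)  # two fused positions; the radar pulls the first estimate
```

The SNPE's role in the real platform is the step this sketch leaves out: using neural networks to classify what each fused object actually is.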
Drive Data highlights the fact that chip giants need to provide full platforms, not just silicon, in order to dominate the IoT. With Nvidia well ahead of the game in supporting intelligent and driverless cars via an AI-driven big data system, Drive Data states Qualcomm’s intention of going up against that rival, as well as Intel, in the intelligent car data platform game – not to mention other contenders from outside the chip industry, such as Google, Tesla, Baidu and Uber.
Audi divides its favours
Nvidia itself is already a powerhouse in the connected car, and is particularly looking to harness its GPU-based computational/AI platform to power the driverless vehicle. At CES, it touted new partnerships with Here, Audi, ZF and Zenrin as stepping stones to auto dominance.
Audi was spreading its favours between Qualcomm and Nvidia. With the latter, it announced a collaboration to put the world’s most advanced AI-driven car on the road by 2020 – a claim that has probably irked Google’s newly minted Waymo division (see inset). Before 2020, Audi will be using Nvidia’s Drive PX 2 platform and DriveWorks software.
The partners had an Audi Q7 SUV at CES, running demos in which passengers were driven around in the back seats, with no driver. Nvidia’s PilotNet deep neural networks are the basis for the car’s ability to see and understand the driving environment, providing machine vision and object recognition, and allowing the Q7 to adapt to the demo’s changing road surfaces and lane markings, including a simulated construction zone detour.
Car makers extend their wireless partnerships
- Fiat-Chrysler (FCA) and Alphabet’s new Waymo division announced a partnership to work on an Android-based reboot of FCA’s Uconnect operating system – possibly the first iteration of a complete version of Android for vehicles.
- FCA also showed off its new concept autonomous vehicle, Portal, at CES. It has worked with the Panasonic Automotive Advanced Engineering team to create a user experience with facial recognition and voice biometric software, to identify who is in the vehicle at any one time and customize the environment for them with music, lighting, vehicle temperature, heated or cooled seats and so on. The experience is powered by Panasonic’s Cognitive Infotainment (PCI) platform, which will be trying to eat into an AI-driven market currently led by Google and IBM.
- Ford is integrating its Sync IVI system with Samsung Gear S2 and S3, so that wearable owners can be given notifications, such as parking alerts, by their car.
- Ford has also added high speed in-car Wi-Fi hotspots to its 2017 models, powered by AT&T’s LTE service. The Sync Connect system can support up to 10 connected devices in any one car.
- Mercedes will integrate Google Assistant into its 2017 models and expects to see customers communicate with their cars through Google Home, in a similar way to Ford’s Amazon Alexa-based system (see separate item).
Nvidia and Audi first graced CES together some seven years ago, in a partnership that gave birth to the latter’s MMI navigation system and virtual cockpit, powered by Nvidia’s GPUs.
The latest announcement sees Audi promise to roll out its new A8 sedan, which it claims will be the world’s first Level Three automated vehicle. This will allow drivers to turn their attention away from the road and allow the system to control the car and monitor the environment, with the expectation that the human driver will respond only to a request to intervene.
Audi’s zFAS system is the heart of its autonomous vehicles, and is built around Nvidia silicon (though there is speculation that the ties may be loosening because of Intel’s Mobileye deal – see below). For the new A8 car, the zFAS platform will be using Nvidia’s Tegra SoC and software, but it’s noteworthy that the Q7 being demonstrated is using the entire Drive PX 2 system, and not just a chip and code.
Other Nvidia moves at CES
Also on the self-driving front, Nvidia said that ZF, a major provider of equipment to the commercial vehicle and trucking industry, is the first top tier auto supplier to make a self-driving computer based on the Drive PX 2 commercially available.
Called the ZF ProAI, the unit uses the Drive PX 2’s AutoCruise configuration, which should allow companies to build self-driving vehicles, ranging from trucks to warehouse forklifts. The unit processes inputs from multiple sensors, including cameras, lidar, radar, and ultrasonic sensors, which help the vehicles locate themselves locally on a map of the wider world.
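Localizing a vehicle against a map, as the ProAI does, essentially means aligning what the sensors see with landmarks the map already knows. The following is a deliberately simplified 1-D sketch of that alignment step – hypothetical numbers, nothing to do with ZF's or Nvidia's actual algorithms, which work in full 3-D with far richer models.

```python
# Minimal sketch of map-based localization: the vehicle refines its position
# by aligning landmarks its sensors detect with the same landmarks stored in
# an HD map. Hypothetical and 1-D for brevity.

from typing import List

def localize(observed: List[float], mapped: List[float]) -> float:
    """Return the vehicle position that best aligns observed landmark
    ranges with their known map positions (least-squares in 1-D is
    simply the mean of the per-landmark differences)."""
    assert len(observed) == len(mapped)
    diffs = [m - o for o, m in zip(observed, mapped)]
    return sum(diffs) / len(diffs)

# Landmarks (signs, poles) at known positions along the road in the map:
map_landmarks = [105.0, 142.0, 188.0]
# The same landmarks, measured as ranges from the car by lidar/radar:
sensed = [5.2, 41.9, 88.1]

print(localize(sensed, map_landmarks))  # estimated vehicle position, ~99.93
```

Averaging over several landmarks is what lets noisy individual measurements still yield a position far more precise than GPS alone.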
And speaking of world maps, Nvidia also announced a deal with mapping firm Here, once part of Nokia and now owned by a consortium of German auto makers, to expand the capabilities of cars powered by Nvidia tech. The new partnership sees Here use Nvidia’s MapWorks AI tech to improve its HD Live Map, with Nvidia developing a localization technology for HD Live Map as part of its DriveWorks software – which will allow automakers using the Drive PX2 computer to integrate the Here localization. The third part of the agreement sees the pair collaborate on incorporating real-time updates into the IVI systems, so that occupants can see the changes dynamically.
“HD maps are essential for self-driving cars,” said Jen-Hsun Huang, Nvidia’s CEO. “Here’s adoption of our deep learning technology for their cloud-to-car mapping system will accelerate automakers’ ability to deploy self-driving vehicles.”
Despite the Here deal, Nvidia also announced a deal with Zenrin, a Japanese mapping company, to jointly develop another cloud-to-car system for self-driving cars, with the Drive PX 2 and DriveWorks software being paired with Zenrin’s mapping platform – in a pretty similar arrangement to Here’s.
“Zenrin’s big data includes road images and point-cloud data captured by mapping-survey vehicles,” said Koji Haraguchi, head of R&D at the mapping firm. “Combining Nvidia’s AI technologies and Zenrin’s big data will enable us to provide wider coverage of HD maps to automotive manufacturers with a dramatically shorter lead time.”
Intel takes 15% stake in Here
If Nvidia was not being exclusive, neither was Here, which has also got far closer to Intel since the holiday. This is symptomatic of more than just a battle with Nvidia for a particular platform, however important – Intel wants to undermine its rival’s entire GPU-based approach to AI and big data platforms, whether for connected cars, the IoT or enterprise cloud.
If it can push its own architectures - which are focused on microprocessors, specialized AI chips and FPGAs rather than GPUs - as superior solutions, it could weaken the whole Nvidia philosophy in key areas of deep learning, such as autonomous vehicles. There is a long way to go with that strategy, since the GPU-based approach has so far proved to be a powerful and relatively simple one for high end AI-driven platforms.
But Intel is putting many of the pieces in place to attack Nvidia’s areas of strength. This was the thinking behind the recent acquisition of Nervana, which had already mounted a credible challenge to CUDA – the software that powers Nvidia’s GPUs – with its own Neon framework and cloud service, and which has a deep learning accelerator chip on the drawing board, set to launch next year. Intel has also acquired Movidius to expand its machine vision capabilities, and has now taken a stake in Here.
The companies which acquired Here from Nokia - Audi, BMW and Daimler – have reduced their one-third stakes to 25% each, selling 15% of the total to Intel and another 10% to a group of Chinese companies (Tencent, NavInfo and GIC).
Intel and Here say they plan to develop a “highly scalable proof-of-concept architecture that supports real time updates of high-definition maps for highly and fully automated driving as well as explore opportunities in IoT and machine learning.”
For Intel, this is a fairly straightforward way for it to secure the mapping and locational data that it needs to power its automotive ambitions, while adding a lot of contextual richness to the data sets. With a little dot-joining, one can spot patterns leading to the deal, such as Intel’s previous partnerships with Mobileye and BMW to build a fleet of fully autonomous cars by 2021.
Indeed, at CES, Here announced its own deal with Mobileye – another of the prominent companies making specialised, optimised chips for AI and IoT functions, and so potentially strengthening Intel’s platform against Nvidia’s.
The importance of the Mobileye deal
Intel’s partnership with Mobileye, augmented by a new deal last fall with Delphi, is so important in the connected car world that some are calling it the Wintel of that market: it could combine Mobileye’s excellent, specialized vision algorithms with a processor roadmap far more advanced than the smaller firm could build on its own.
Daniel Galves, Mobileye’s chief communications officer, recently told EETimes about the possible division of labor between its own technology and Intel’s, saying: “The Mobileye SoC will run all sensor processing software (8-camera surround view by Mobileye, radar/lidar processing by Delphi), localization mapping by Mobileye REM, and sensor fusion. On the Intel SoC, all driving policy (reinforcement learning algorithms for path strategy by Mobileye) and driving control (driving behavior software by Delphi’s Ottomatika) will run.”
That is before Intel has even unveiled its expected automotive system-on-chip, based on multiple Xeon cores, which could combine with Mobileye’s EyeQ chip – the computer vision engine behind the ADAS (advanced driver assistance systems) market that Mobileye dominates.
There has already been speculation that Intel has struck a real blow at Nvidia with its Mobileye and other partnerships. Nvidia was at the heart of the Audi zFAS – a centralized assistance controller the German carmaker is developing with Delphi and Mobileye, which also uses FPGAs from Altera, now part of Intel.
If Altera opens doors for Intel in some unfamiliar markets, supporting optimized coprocessing for tasks with which standard x86 CPUs struggle, it will also squeeze some other chipmakers with a longer history in machine-to-machine and automotive. Mobileye, for instance, has traditionally worked with STMicro, but may now be hedging its bets with its Intel partnership.
And now Here is part of the web too, announcing its own deal, to integrate data from Mobileye’s automotive machine vision sensors into Here’s Open Location Platform (OLP), using Mobileye’s Global Roadbook (GLRB) as a data layer. This is designed to increase the accuracy of the OLP, by adding contextually up-to-date landmark and roadway information pulled from vehicles using Mobileye’s sensors – the eyes of many of the most advanced semi-autonomous cars on the market. These vehicles will be piping the Road Segment Data (RSD) generated by Mobileye’s REM technology to the Here cloud, which will then be able to be shared with other vehicles using Here’s Live Map platform.
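The crowd-sourced flow described above – many cars uploading road-segment observations, the cloud merging them, subscribing vehicles downloading the result – can be sketched very roughly as follows. All names and the keep-the-freshest merge policy here are illustrative assumptions, not how Mobileye's REM or Here's Live Map actually work.

```python
# Hedged sketch of crowd-sourced map updates, loosely modelled on the
# Mobileye REM / Here Live Map flow: each car uploads road-segment
# observations, the cloud keeps the freshest report per segment, and
# other vehicles pull the merged layer. All identifiers are invented.

from typing import Dict, Tuple

class SegmentStore:
    def __init__(self) -> None:
        # segment_id -> (timestamp, observation)
        self._latest: Dict[str, Tuple[int, str]] = {}

    def ingest(self, segment_id: str, timestamp: int, observation: str) -> None:
        """Accept a vehicle upload, keeping only the newest report per segment."""
        current = self._latest.get(segment_id)
        if current is None or timestamp > current[0]:
            self._latest[segment_id] = (timestamp, observation)

    def snapshot(self) -> Dict[str, str]:
        """The merged layer a subscribing vehicle would download."""
        return {seg: obs for seg, (_, obs) in self._latest.items()}

cloud = SegmentStore()
cloud.ingest("A40-eastbound-km12", 100, "lane closed")
cloud.ingest("A40-eastbound-km12", 250, "lane reopened")   # newer report wins
cloud.ingest("B455-km3", 180, "new speed limit 80")
print(cloud.snapshot())
```

The "self-healing" quality Here talks about falls out of exactly this loop: stale observations are continuously displaced by fresher ones from whichever car passed the segment last.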
As Here’s CEO, Edzard Overbeek, put it: “A real time self-healing and HD representation of the physical world is critical for autonomous driving, and achieving this will require significantly more powerful and capable in-vehicle compute platforms. As a premier silicon provider, Intel can help accelerate Here’s ambitions in this area, and support the creation of a universal, always up-to-date digital location platform that spans the vehicle, the cloud, and everything else connected.”
This is the latest step in Intel’s automotive transition, which has seen the division emerge from the IoT wing in the past year or so. That division is now getting its own brand, called Intel Go, which will revolve around new developer kits for automakers, based on its Atom and Xeon processors.
Under the Go brand, Intel said at CES that it was launching two new developer kits for auto makers looking to add a brain to their vehicles. With one variant based around the low-power Atom processor, and another with a far more powerful Xeon backed up by new Arria 10 FPGAs, Intel is pitching the designs as the answer to all automotive computational needs – from sensor fusion to decision making.
The new Intel Go Automotive SDK is Intel’s answer to the software side of the picture, while the Go Automotive 5G Platform aims to provide the V2X communication requirements, and to be the first commercial ‘5G’ world modem (see separate item). The new Go kits are going to be rolled out in a fleet of 40 BMW cars this year, as part of the test platform for the fleet with Mobileye and Intel.
Towards the 5G car
Of course, nobody could escape that ‘5G’ label, even in the car sector where 4G standards for wireless connectivity are not yet fully supported – last week saw tests based on the new LTE-V specifications, but there are all kinds of disputes between the 3GPP approach, the Wi-Fi community’s 802.11p and the DSRC specialized protocol. More of that potential industry split in an upcoming issue, but we can be sure that 5G, for all the hopes that it will prove to be an umbrella for many connectivity specs, will not really manage to neutralize all the politics behind these varying standards.
But as long as 5G is not commercially real, participants from the car and wireless industries can keep their visions alive. For instance, PSA Group, owner of Peugeot and Citroën, has partnered with Orange and Ericsson to work on "Towards 5G" car platforms based for now on the emerging LTE-V standard and on virtualization, but with a view to creating a roadmap towards 5G-based Intelligent Transport Systems (ITS) to support smart and autonomous vehicles.
Their pilot project sits outside similar efforts by the Qualcomm-driven 5G Automotive Association (5GAA), although Ericsson is part of both efforts. Orange will provide the spectrum and cellular network for the trial sites; Ericsson the radio and distributed, virtualized core network; and PSA will define the requirements, scope and user experience and provide technical validation.
“Connected IoT services are a crucial way to enhance the user experience for our customers, who today demand unprecedented levels of comfort and convenience as well as personalized services in their vehicles,” said Carla Gohin, VP of research, innovation and advanced technologies at PSA.
Copyright © 2016, Wireless Watch
Wireless Watch is published by Rethink Research, a London-based IT publishing and consulting firm. This weekly newsletter delivers in-depth analysis and market research of mobile and wireless for business. Subscription details are here.