Tesla to stop killing drivers: Software update beamed to leccy cars

'Autopilot' adds radar and driver prods


Tesla is changing how its "Autopilot" super-cruise-control works in response to the death of one of its customers.

The over-the-air software update will be automatically applied to the electric cars this month and will expand the use of radar sensors to decide whether a crash is likely to occur.

In a blog post on the Tesla website, the company explains that the radar sensors added to cars in October 2014 were initially intended as a "supplementary sensor to the primary camera and image processing system." But they will now be used to determine car actions without requiring the camera to confirm the existence of an object.

Radar is not a perfect tool for seeing what is going on around a car, Tesla's techies explain. People appear partially translucent, and wood and painted plastic are effectively invisible. Metal objects, meanwhile, can appear much larger than they really are from a single radar sensor's perspective, because metal reflects the signal so strongly. That could result in a car slamming on the brakes to avoid nothing larger than a soda can, the company warned.

Tesla's solution is to combine the input from all radar sensors every tenth of a second to build a more reliable 3D picture of the world outside the car.
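Tesla has not published how that fusion actually works. As a toy illustration only, with every name and threshold below invented for the sketch, the principle of corroborating returns across successive snapshots might look like this:

```python
from collections import defaultdict

# Toy sketch of temporal radar fusion; NOT Tesla's algorithm, which is
# unpublished. Returns from successive 0.1-second snapshots are binned
# into a coarse 3D grid, and only cells seen in several frames count as
# solid, which damps one-off strong reflections (the soda-can problem).

CELL = 0.5           # grid resolution in metres (assumed)
FRAMES_NEEDED = 3    # frames a cell must appear in to count (assumed)

def to_cell(point):
    """Quantize an (x, y, z) radar return to a grid cell."""
    return tuple(int(coord // CELL) for coord in point)

def fuse_snapshots(snapshots):
    """snapshots: one list of (x, y, z) returns per 0.1-second frame."""
    hits = defaultdict(int)
    for frame in snapshots:
        for cell in {to_cell(p) for p in frame}:  # each cell counted once per frame
            hits[cell] += 1
    # Keep only cells corroborated across enough frames
    return {cell for cell, count in hits.items() if count >= FRAMES_NEEDED}
```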

Why the change? Because it was the car's failure to distinguish between a white trailer and the bright sky behind it that led to 40-year-old Joshua Brown being killed in his 2015 Tesla Model S while driving in Florida in May.

The car was using the optional autopilot system, and both he and the computer failed to spot the 18-wheel tractor trailer. The car went under the trailer, slicing the top of the car off and killing Brown.

A preliminary report from the US National Transportation Safety Board (NTSB) in July said Brown was traveling at 74mph when he hit the trailer – the speed limit on the road is 65mph.

Height right

Tesla initially mentioned another factor, one this update also addresses, though it was not picked up on as much as the failure to distinguish the sky from the trailer: the trailer's height.

Back in June, Tesla said: "The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield."

In its post, the company alludes to this when it says: "When the car is approaching an overhead highway road sign positioned on a rise in the road or a bridge where the road dips underneath, this often looks like a collision course. The navigation data and height accuracy of the GPS are not enough to know whether the car will pass under the object or not. By the time the car is close and the road pitch changes, it is too late to brake."

The implication is that Brown's car may have noticed something ahead of it but calculated that an impact was unlikely. (Of course, if the US passed a law requiring "Mansfield bars", the underride guards that hang from the rear of trailers, on the sides of trucks as well as the back, the death could most likely have been avoided altogether.)

Tesla's answer to the height issue is to use its fleet of cars to build a radar database of road signs, bridges and other overhead objects, producing a blueprint of what should be there. Each car can then compare that blueprint with what its own radar sees as it travels along the road: anything out of the ordinary is far more likely to be a genuine obstruction.

"If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist," the company notes.

"The net effect of this, combined with the fact that radar sees through most visual obscuration, is that the car should almost always hit the brakes correctly even if a UFO were to land on the freeway in zero visibility conditions," the post smirks.

Tesla even claims that it will be able to see objects in front of the car ahead of you, by bouncing radar underneath that car and recognizing them thanks to "the radar pulse signature and photon time of flight." The idea is that your car will brake to avoid something even if the car in front doesn't (such as in heavy fog).
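"Time of flight" here is ordinary radar ranging: the pulse travels out and back at the speed of light, so halving the round-trip time gives the distance. A back-of-envelope illustration:

```python
C = 299_792_458  # speed of light in m/s

def range_from_echo(round_trip_seconds):
    """Distance to a reflector from a radar pulse's round-trip time."""
    return C * round_trip_seconds / 2

# A pulse that comes back after one microsecond has hit something
# roughly 150 metres ahead:
print(range_from_echo(1e-6))  # ~149.9
```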

Why doesn't Tesla use the "lidar" system favored by self-driving competitors such as Google? Lidar is basically the same idea as radar but uses lasers rather than radio waves. Musk claims it's because lidar doesn't offer the same capabilities, though it may also be because all recent Teslas already carry radio-wave emitters and receivers, and introducing lasers would require a significant re-engineering effort.
