
Custom superchippery pulls 3D from 2D images like humans

Self-driving cars, 360°-vision hats ... much better CCTV


Computing brainboxes believe they have found a method which would allow robotic systems to perceive the 3D world around them by analysing 2D images as the human brain does - which would, among other things, allow the affordable development of cars able to drive themselves safely.

For a normal computer, or even a normal supercomputer, analysing 2D images of fast-moving traffic as quickly as a human can is a massive task requiring colossal resources. For instance the Roadrunner hypercomputer, third most powerful in the world according to the latest rankings, is thought by boffins in the know to perhaps be capable of handling a car - though it would have to do so by remote hookup, as it weighs more than 200 tonnes and requires three megawatts of power. This is obviously not going to be a mass-market solution.

But yesterday at the High Performance Embedded Computing (HPEC) workshop in Boston, engineers presented a new system dubbed "Neuflow". This uses custom hardware modelled on the brain's visual-processing centres, all built on a single chip. Its designers say that it can process megapixel images and extract 3D information from them in real time.

Not only can a Neuflow system, according to its inventors, process imagery at blistering speed: it is also small and economical of power.

"The complete system is going to be no bigger than a wallet, so it could easily be embedded in cars and other places," says Eugenio Culurciello of Yale uni's engineering department. Apparently such a Neuflow "convolutional neural network" machine would require only a few watts of power - it might, in fact, be a viable portable or wearable solution as well as vehicle-mounted.

This would mean that a robot car equipped with simple cameras could perceive the road, buildings, other cars and pedestrians in 3D: there would be no need for the expensive arrays of close-in laser scanner systems generally used on autonomous-car prototypes today.
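
The researchers don't spell out how depth is recovered, but one standard way to get 3D from plain 2D cameras - not necessarily Neuflow's method - is stereo disparity: a nearby object shifts more between two side-by-side views than a distant one. The figures in the snippet below are hypothetical.

    def depth_from_disparity(disparity_px, focal_px, baseline_m):
        # Pinhole-stereo relation: depth Z = f * B / d, where f is the
        # focal length in pixels, B the camera separation in metres, and
        # d the pixel shift of a feature between left and right views.
        return focal_px * baseline_m / disparity_px

    # Hypothetical figures: 700-pixel focal length, cameras 12 cm apart,
    # a pedestrian's outline shifting 35 pixels between the two views.
    print(depth_from_disparity(35.0, 700.0, 0.12))  # -> 2.4 metres

The hard part is finding which pixels correspond between the two views fast enough for a moving car - exactly the sort of massively repetitive image processing a convolutional chip is built for.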

Still smaller devices might be possible, perhaps allowing a soldier's helmet to watch all around him and pick out movement or threats, or permitting smaller robots to get about inside buildings or other cluttered environments without constant remote control from a human operator.

Needless to say, Neuflow tech could also hugely enhance the effectiveness of CCTV and similar surveillance systems. At present these are generally used reactively, well after a given event, and analysing their footage eats up thousands of man-hours. Computers that could pick out and track moving objects in the imagery would potentially be able to automate much of this work and speed it up.

There's more here on convolutional neural networks for those interested, including code downloads and other goodies. ®
