Smut slinger dreams of AI software to create hardcore flicks with your face – plus other machine-learning news
Roundup It's a long weekend in England and Wales, with many Reg vultures taking time out and making the most of what's left of the quiet August month.
We haven't forgotten you, though, so here's a roundup of AI-software-related tidbits.
Oh dear, a porn company wants to monetize the deepfakes craze: Hey, remember when internet perverts used AI to stitch famous people's faces onto the bodies of smut flicks, and generate X-rated vids of celebs, dubbed "deepfakes"? It sparked outrage, with people freaking out over this direction of deep-learning technology, and how it could be used by people with little or no coding background to craft almost believable bogus porno. Tools emerged to detect faked vids.
Well, now porn biz Naughty America wants to sell this sort of caper to horny netizens. You have to swear you're uploading footage of your own face – and not an ex-girlfriend or a stranger's – pulling various expressions of pleasure. The service then spits out a realistic-looking saucy scene with you pasted into it.
Naughty America is also offering other options such as photoshopping performers so it looks as though they’re going at it in any room of your choice. Maybe your room? Maybe you with someone else in your room? To stop this being abused, the smut slinger promises to ensure it has the consent of everyone pictured – and as we all know, it's completely trivial to check...
Right now, it sounds as though Naughty America is doing fancy editing, automated and customized for its punters, rather than full-on AI, all while, ahem, riding the deepfakes bandwagon. But, y'know, it could be the start of something new. The biz is looking to use AI to improve its output, such as making adult stars say your name during, well, you get the idea. “It’s exciting,” CEO Andreas Hronopoulos told Variety. “We see customization and personalization as the future.”
Facebook AI researchers and NYU team up to study MRI: Eggheads at Facebook and New York University in the US have announced a new project: fastMRI.
The goal is to work out how AI can be used to speed up magnetic resonance imaging (MRI) scans. At the moment, a scan can take roughly 15 minutes to more than an hour, depending on how much body tissue needs to be imaged, which is longer than X-ray and CT scans take. There are many benefits to shortening MRI scans: for example, more people can be seen; the technology can be used to replace CT and X-ray scans, thus reducing the amount of ionizing radiation patients and doctors are exposed to; and it's easier to lie still for a few minutes than an hour or so.
NYU has amassed about three million MRI images of knees, brains and livers to train convolutional neural networks built by Facebook. MRI scanners work by building up an internal picture of the body slice by slice, as a stack of 2D images. Collecting more slices during a scan generates a more accurate 3D model, although it takes longer to complete the scan. Taking fewer 2D slices, and using an AI to fill in the gaps, will reduce the time spent in an MRI machine.
“Using AI, it may be possible to capture less data and therefore scan faster, while preserving or even enhancing the rich information content of magnetic resonance images. The key is to train artificial neural networks to recognize the underlying structure of the images in order to fill in views omitted from the accelerated scan,” according to Facebook’s blog post.
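The idea can be sketched in a few lines of toy code. Nothing below is from the actual fastMRI project – the function names are invented, and plain linear interpolation stands in for the trained convolutional neural network – but it shows the shape of the trick: acquire every other 2D slice to "accelerate" the scan, then fill in the skipped slices from their neighbours.

```python
import numpy as np

def accelerated_scan(volume, keep_every=2):
    """Keep only every Nth slice, cutting acquisition time."""
    return volume[::keep_every], keep_every

def reconstruct(sparse_slices, keep_every, full_depth):
    """Fill in skipped slices by interpolating between acquired ones.
    (Stand-in for the neural network fastMRI would train.)"""
    depth, h, w = sparse_slices.shape
    acquired_z = np.arange(0, full_depth, keep_every)[:depth]
    recon = np.empty((full_depth, h, w))
    for z in range(full_depth):
        # find the acquired slices bracketing position z
        lo = np.searchsorted(acquired_z, z, side="right") - 1
        hi = min(lo + 1, depth - 1)
        if acquired_z[hi] == acquired_z[lo]:
            recon[z] = sparse_slices[lo]
        else:
            t = (z - acquired_z[lo]) / (acquired_z[hi] - acquired_z[lo])
            recon[z] = (1 - t) * sparse_slices[lo] + t * sparse_slices[hi]
    return recon

# Simulate a smooth 15-slice "volume", scan only 8 slices, rebuild all 15
full = np.stack([np.full((4, 4), z, dtype=float) for z in range(15)])
sparse, step = accelerated_scan(full, keep_every=2)
rebuilt = reconstruct(sparse, step, full_depth=15)
print(np.abs(rebuilt - full).max())  # → 0.0 on this smooth toy volume
```

On a real scan the missing structure is nothing like this linear toy data, which is exactly why Facebook and NYU want a learned model rather than interpolation to do the in-filling.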
Nvidia's DGX-2 NVSwitch-eroo: Nvidia’s CEO Jensen Huang introduced the DGX-2 as the “world’s largest GPU” at Nv's annual GPU Technology Conference earlier this year. The DGX-2 system is actually made up of 16 V100 GPUs, each with 32GB of memory, as well as two 24-core Xeon CPUs, 1.5TB of DDR4 DRAM memory, and 30TB of NVMe storage. It maxes out at two petaFLOPS of performance.
What makes the DGX-2 act as a single GPU is the way that all 16 are connected. Nvidia has posted an explanation this month, describing how its NVSwitch chips together act as a crossbar to hook up all the GPU chips in the box at once, using NVLink as the interconnect. Any two GPUs in the 16-chip cluster can talk to each other using NVLink at 300GB/s via the network of NVSwitch chips, which is faster than going over the PCIe bus.
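A quick back-of-the-envelope check of those figures. The per-link rate below is an assumption based on NVLink 2.0's roughly 50GB/s bidirectional bandwidth per link; the GPU and memory counts come straight from Nvidia's published specs.

```python
NUM_GPUS = 16
MEM_PER_GPU_GB = 32
LINKS_PER_GPU = 6     # NVLink 2.0 ports on each V100 (assumption)
GB_S_PER_LINK = 50    # ~50GB/s bidirectional per link (assumption)

# All six links fan out through the NVSwitch crossbar, so a GPU pair
# can, in principle, use the full per-GPU link budget when talking.
per_gpu_bandwidth_gb_s = LINKS_PER_GPU * GB_S_PER_LINK
total_gpu_memory_gb = NUM_GPUS * MEM_PER_GPU_GB

print(per_gpu_bandwidth_gb_s)  # → 300, matching the quoted 300GB/s
print(total_gpu_memory_gb)     # → 512GB of pooled memory across the box
```

That pooled 512GB address space, reachable at full NVLink speed from any chip, is what lets Nvidia market the box as one giant GPU.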
AI translation is good for business: Academics have found that machine translation is good for attracting international customers on e-commerce platforms.
The paper emitted by the National Bureau of Economic Research in America this month reports that there was a 17.5 per cent increase in exports on eBay due to folks using the website's electronic translators. The boffins at the Massachusetts Institute of Technology and Washington State University, in the US, analyzed English-to-Spanish interactions using eBay Machine Translation (eMT), a language translation system integrated into the cyber-souk.
“Specifically, we find that efforts to remove language barriers provide substantial increases to market efficiency as well as platform profit,” the paper's authors stated in their work.
It’s not too surprising. It’s easier to find items if eBay expands the search across various descriptions written in different languages. Users are more likely to buy things if they can see the item’s information, such as its brand, size, color, etc, in their own language.
It’s interesting to think about how machine translation could affect other sites. What if El Reg articles could be effectively translated without having to resort to Google Translate, and millions of people from non-English-speaking countries showed up? Just imagine the comments.
Just in case you missed it… OpenAI entered its bots into The International, the biggest tournament for the esports video game Dota 2 – but it called it quits after its agents lost two matches in a row against skilled humans.
The stakes were high: OpenAI took a big risk showcasing its technology in public, and, we're told, it learned a lot from the experience, which can be used to further train and develop its machine-learning software.