Public cloud will grow when experienced IT folks DIE

Clouds get real when the Facebook kids sit in the big chair

Analysis Major adoption of public cloud computing services by large companies won't happen until the current crop of IT workers is replaced by kiddies who grew up with Facebook, Instagram, and other cloud-centric services – so says Rackspace CTO John Engates. Should we be worried?

"10 to 15 years ago no one would put their credit card on the internet either – it takes time to soak in this stuff," Engates told The Reg.

He believes the misgivings that some people feel about putting their data in distant multi-tenant bit barns come from a lifetime spent maintaining and fiddling with servers, and a deep distrust of companies offering to do it for them.

"I just think it's a time thing," Engates says. "Those guys that are the older guys in IT will retire and the new guys that are the Facebook generation or the Instagram generation will become the new guys, and they will have only lived in the cloud era."

This statement somewhat glosses over the fact that many people are employed in organizations to run data centers, and that a sudden keenness on cloud among the bosses can give admins visions of pink slips landing on their desks.

"To some degree there is a generational issue. However, it's also true that it's about how you believe you're valued by your org. If you feel that building something makes you more valuable, then you're less likely to want someone else to do it," Mark Thiele, executive vice president of data center tech at Switch, told The Reg via email.

But though IT workers can use their experience and knowledge to counsel their bosses against moving to the cloud, things get tricky when new hires are cloud cultists.

The rub is that young IT devs will have grown up with cloud-oriented ways of developing code and interacting with businesses, so remotely hosted as-a-service technologies will be their default way of solving many technical problems, Engates says.

This is already happening: the majority of startups this hack speaks to are enthusiastic users of public clouds such as Amazon Web Services, because it means they don't need to make any capital expenditures, nor take time away from developing their applications to maintain infrastructure.

"We're totally in the cloud + a little colo for 24/7 testing," Bradford Stephens, chief executive of database startup Drawn to Scale, told The Reg via email. "We have large Linux development desktops for 'virtual clusters', but that's it."

"I'm sure there's a generational bias when it comes to using the cloud... I think grizzled IT folks don't trust anything they can't hit the reset button on," he said.

Won't someone think of the children?

Rackspace expects the same attitude in its young employees, Engates says:

The next 18-year-old guy that we hire to be a developer here will have never experienced the client/server era, he will have only lived in the 'ability to spin up a server on the cloud era' – and why would he think twice about building a datacenter? He hosts his code on GitHub, he lives in the public cloud, he lives in a Facebook world, and he just would not have a second thought about doing it that way – the trend is we will go in that direction.

Not understanding the ins and outs of data center infrastructure can come back to hurt companies – unsophisticated startup Rap Genius suffered from poor application performance for two years because its developers put all their trust in the Heroku platform-as-a-service they sat upon, and didn't run basic tests.

It also opens companies up to an unhealthy dependence on their service provider: when Amazon or Google or Microsoft suffers a data center brownout, thousands of businesses that depend on their services can be knocked offline. Only the largest cloud users can engineer around such outages, as Netflix has done in the past; for many startups, failure simply becomes a way of life.

But for all its faults, Engates believes use of the cloud will keep on growing, because young developers find its technologies irresistible and get hooked on a diet of always-available services delivered from a homogeneous mass of IT equipment.

"Developers want to use tools that make their life easy and allow them to go fast," he says.

This Reg hack thinks that humanity's great capacity for laziness will also put pressure on organizations to adopt the cloud, because it will be all their devs have ever known – why bother learning how to chain together data center infrastructure when packaging up someone else's technology can earn you $30m, as Nick D'Aloisio of Summly found?

Unless regulation mandates that you develop on-site datacenter expertise, what incentive is there for a business to plough resources into building its own technical capabilities? And what happens if you build a datacenter at the top of your demand curve – as Zynga did with its hugely expensive "Z-Cloud" – leaving yourself with unused IT assets and a whopping facilities bill? Pink slips all round, we imagine.
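The top-of-the-demand-curve problem can be made concrete with some back-of-the-envelope arithmetic. All figures below are hypothetical, invented purely to illustrate the shape of the trade-off, not taken from Zynga or any cloud provider's price list:

```python
# Hypothetical cost comparison: building a datacenter sized for peak demand
# versus renting elastic cloud capacity that tracks average usage.
# Every price and workload figure here is made up for illustration.

def datacenter_cost(peak_servers, capex_per_server, monthly_opex_per_server, months):
    """Own the hardware: pay capex for peak capacity up front,
    then opex on every server whether it is busy or idle."""
    return peak_servers * (capex_per_server + monthly_opex_per_server * months)

def cloud_cost(avg_servers, hourly_rate, months, hours_per_month=730):
    """Rent by the hour: pay only for the average number of servers in use."""
    return avg_servers * hourly_rate * hours_per_month * months

# A workload that peaks at 1,000 servers but averages 250 over three years.
own = datacenter_cost(peak_servers=1000, capex_per_server=5000,
                      monthly_opex_per_server=100, months=36)
rent = cloud_cost(avg_servers=250, hourly_rate=0.50, months=36)

print(f"own:  ${own:,.0f}")   # $8,600,000 – capacity sized for the peak
print(f"rent: ${rent:,.0f}")  # $3,285,000 – elastic capacity tracking usage
```

With these (invented) numbers, building for the peak costs well over twice as much as renting for the average – and the gap widens the spikier the demand curve gets, which is exactly the position a social-games outfit finds itself in when a hit title fades.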

If all the money is flowing to startups built on top of cloud systems, then what incentive is there for a young developer to learn how to physically maintain a fleet of servers?

Could this not be part of a real transition in technology, rather than a fashion craze? Much of the technology industry is barreling down a path that sees major OEMs plough money into the research and development of low-cost commodity datacenter hardware, and software tools for managing it all are coming to the fore. The rise of technologies such as OpenStack and the huge investment companies see in software-defined networking all point to a future in which power lies with the ability to manipulate software-based infrastructure control planes, and knowledge of actual hardware loses its value.

Perhaps cloud-skeptics are on the wrong side of history, and organizations that have a no-cloud policy are destined to die out – excluding those where regulation mandates keeping data in-house, of course.

But what of the holdouts who try to stop their organizations' data leaking onto the cloud? What about the engineers within companies who think maybe it's not a good idea to get hooked on the cloud, given the autonomy and control a company gives up in the process – what about them?

"The conservative people, the people that are less likely to go to cloud – they'll just get run over," Engates says. ®


The cloud cultists of today could wind up being the server huggers of tomorrow, having the same relationship with cloud management tools as existing IT pros have with favored bits of hardware. "It's likely that some of the new folks will become huggers of something else (a particular cloud platform, Puppet or Chef, etc.)," Thiele says.
