DARPA collab launches fast cloud-to-cloud provisioning

Yet another elastic cloud


A DARPA-driven project based on OpenStack has been demonstrated in the US, with the bold claim that it will eventually lead to sub-second provisioning for connectivity between clouds.

The world is already familiar with the concept of elastic clouds, with Amazon, Google, Microsoft, and World+Dog offering some variant on such themes for customers on their services. Cloud-to-cloud elasticity is another matter, since carriers and their optical networks have to be pulled into the stack.

IBM, AT&T, and Applied Communications Sciences worked together on the project, which IBM describes as a proof of concept demonstrating “a cloud system that monitors and automatically scales the network up or down as applications need”.

The basic signalling is quite simple, IBM Research's Doug Freimuth explains in that post: “It works by the cloud data centre sending a signal to a network controller that describes the bandwidth needs, and which cloud data centres need to connect”.
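To make that concrete, here is a minimal sketch of what such a signal might look like. The field names and the controller endpoint are our own invention for illustration, not anything published from the prototype:

```python
# Hypothetical sketch only: the payload fields and the controller URL are
# invented to illustrate the signal Freimuth describes, not the actual API.
import json
import urllib.request

bandwidth_request = {
    "source_datacenter": "cloud-dc-east",       # which cloud data centre is asking
    "destination_datacenter": "cloud-dc-west",  # which cloud it needs to reach
    "bandwidth_gbps": 10,                       # how much capacity the application needs
    "duration_minutes": 30,                     # how long the extra capacity is wanted
}

# Push the request to a (hypothetical) network controller API.
req = urllib.request.Request(
    "http://network-controller.example/v1/bandwidth-requests",
    data=json.dumps(bandwidth_request).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print("controller replied:", resp.status)
```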

For that, Freimuth says, the system needs an orchestrator between data centres, akin to the in-data-centre orchestration already pursued by cloud vendors and the open source world alike.

The idea is also to get rid of truck rolls when winding the WAN muscle connecting the clouds up or down: the assumption is that the optical medium already has the necessary bandwidth, so all that's really needed is for the carrier's equipment to respond to provisioning requests and pass the change on to billing systems (a flow sketched below).
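A rough sketch of that carrier-side flow follows. The class and method names are placeholders we have made up to show the shape of it (take a provisioning request, reconfigure the circuit, tell billing); they are not the actual AT&T or ACS interfaces:

```python
# Hypothetical carrier-side sketch: names are invented for illustration only.
from dataclasses import dataclass

@dataclass
class ProvisioningRequest:
    circuit_id: str
    requested_gbps: int

class OpticalController:
    def provision(self, request: ProvisioningRequest) -> None:
        # Assume the fibre already has the headroom; the carrier equipment only
        # needs reconfiguring to light up the requested capacity.
        self._set_circuit_bandwidth(request.circuit_id, request.requested_gbps)
        # ...and the change goes straight to billing, no truck roll required.
        self._notify_billing(request.circuit_id, request.requested_gbps)

    def _set_circuit_bandwidth(self, circuit_id: str, gbps: int) -> None:
        print(f"reconfiguring {circuit_id} to {gbps} Gb/s")

    def _notify_billing(self, circuit_id: str, gbps: int) -> None:
        print(f"billing updated: {circuit_id} now charged at {gbps} Gb/s")

OpticalController().provision(ProvisioningRequest("nyc-chi-01", 40))
```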

According to R&D Magazine, AT&T developed the bandwidth-on-demand networking architecture, while ACS provided its optical-layer routing and signalling technology.

AT&T Labs' Robert Doverspike said the proof-of-concept combined SDN concepts with “advanced, cost-efficient network routing in a realistic carrier network environment”.

“This prototype was implemented on OpenStack, an open-source cloud-computing platform for public and private clouds, elastically provisioning WAN connectivity and placing virtual machines between two clouds for the purpose of load balancing virtual network functions,” R&D Mag continues.
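For readers wanting a feel for that last step, here is a short sketch of placing a workload in a peer cloud with the openstacksdk library. The SDK calls are real, but request_wan_bandwidth() and the cloud, image, flavor, and network names are our own placeholders, not details of the prototype:

```python
# Sketch only: openstacksdk calls are genuine, but request_wan_bandwidth() and
# the named cloud/image/flavor/network are placeholder assumptions.
import openstack

def request_wan_bandwidth(src: str, dst: str, gbps: int) -> None:
    """Hypothetical stand-in for the signal to the carrier's network controller."""
    print(f"asking the WAN for {gbps} Gb/s between {src} and {dst}")

# 1. Ask the carrier for inter-cloud capacity before moving any workload.
request_wan_bandwidth("cloud-a", "cloud-b", gbps=10)

# 2. Place a virtual network function in the peer cloud once the path exists.
conn = openstack.connect(cloud="cloud-b")  # assumes clouds.yaml defines "cloud-b"
server = conn.compute.create_server(
    name="vnf-load-balancer-01",
    image_id=conn.compute.find_image("ubuntu-20.04").id,
    flavor_id=conn.compute.find_flavor("m1.medium").id,
    networks=[{"uuid": conn.network.find_network("private").id}],
)
conn.compute.wait_for_server(server)
print("VNF instance active in the peer cloud:", server.name)
```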

The prototype was developed under DARPA's seven-year-old CORONET program. ®


Other stories you might like

  • DuckDuckGo tries to explain why its browsers won't block some Microsoft web trackers
    Meanwhile, Tails 5.0 users told to stop what they're doing over Firefox flaw

    DuckDuckGo promises privacy to users of its Android, iOS browsers, and macOS browsers – yet it allows certain data to flow from third-party websites to Microsoft-owned services.

    Security researcher Zach Edwards recently conducted an audit of DuckDuckGo's mobile browsers and found that, contrary to expectations, they do not block Meta's Workplace domain, for example, from sending information to Microsoft's Bing and LinkedIn domains.

    Specifically, DuckDuckGo's software didn't stop Microsoft's trackers on the Workplace page from blabbing information about the user to Bing and LinkedIn for tailored advertising purposes. Other trackers, such as Google's, are blocked.

    Continue reading
  • Despite 'key' partnership with AWS, Meta taps up Microsoft Azure for AI work
    Someone got Zuck'd

    Meta’s AI business unit set up shop in Microsoft Azure this week and announced a strategic partnership it says will advance PyTorch development on the public cloud.

    The deal [PDF] will see Mark Zuckerberg’s umbrella company deploy machine-learning workloads on thousands of Nvidia GPUs running in Azure. While a win for Microsoft, the partnership calls in to question just how strong Meta’s commitment to Amazon Web Services (AWS) really is.

    Back in those long-gone days of December, Meta named AWS as its “key long-term strategic cloud provider." As part of that, Meta promised that if it bought any companies that used AWS, it would continue to support their use of Amazon's cloud, rather than force them off into its own private datacenters. The pact also included a vow to expand Meta’s consumption of Amazon’s cloud-based compute, storage, database, and security services.

    Continue reading
  • Atos pushes out HPC cloud services based on Nimbix tech
    Moore's Law got you down? Throw everything at the problem! Quantum, AI, cloud...

    IT services biz Atos has introduced a suite of cloud-based high-performance computing (HPC) services, based around technology gained from its purchase of cloud provider Nimbix last year.

    The Nimbix Supercomputing Suite is described by Atos as a set of flexible and secure HPC solutions available as a service. It includes access to HPC, AI, and quantum computing resources, according to the services company.

    In addition to the existing Nimbix HPC products, the updated portfolio includes a new federated supercomputing-as-a-service platform and a dedicated bare-metal service based on Atos BullSequana supercomputer hardware.

    Continue reading

Biting the hand that feeds IT © 1998–2022