How to build sustainable AI? Start with the network
Baking in sustainability begins with smart network design
Sponsored Feature The AI revolution is redrawing the business and consumer landscape across the globe. But it is also changing how telco and cloud service providers design and build out the infrastructure that will enable this revolution.
The levels of investment pouring into AI and communications are staggering. Earlier this year, IDC predicted that AI server and storage infrastructure spending would hit $200bn by 2028. Service providers will dominate this spending, the firm said in August, as they support "massive" increases in agentic workloads.
Meanwhile, the telecoms sector will steadily increase its capital investment at a 2.4 percent CAGR from 2024 to 2028, according to a recent forecast by PwC. Operator spending is currently focused on fixed connectivity, PwC found, but 6G is on the horizon, which will drive more investment in wireless infrastructure.
But building and running this infrastructure also takes a massive amount of resources, particularly energy. Datacenters alone accounted for 1.5 percent of worldwide electricity demand in 2024, according to the International Energy Agency, a share that is set to reach around three percent by 2030. Data transmission networks currently account for a similar share of electricity use as datacenters, the agency says.
So, the pressure is on providers and enterprises not just to deploy AI services, but to demonstrate ROI on these investments and, increasingly, to do so sustainably.
At ZTE's Global Summit and User Congress in Milan, leaders from the Chinese technology giant explained just what this means for its customers and what it is doing to help them.
While the technology industry has focused on decoupling and modularizing technologies and products over the last decade, when it comes to delivering AI, compute and connectivity are increasingly intertwined, ZTE argues.
Hyperscale datacenters used for training models rely on high performance networking to connect clusters of GPUs and to shift the massive datasets involved. Meanwhile, workloads such as inference or smart manufacturing mean the edge is more important than ever, raising the bar on latency. And consumers and workers alike will increasingly demand seamless AI-powered services, whether over fixed or wireless networks.
Peng Aiguang, SVP of ZTE and president of the Europe and Americas regions, said that since the AI explosion caused by the launch of ChatGPT, the firm had updated its strategy from connectivity to "connectivity+computing" as it sought to support its proposition of "All in AI, AI for All".
Peter Hu, vice president of ZTE and general manager of wired product marketing, added that projections showed that AI will account for 60 percent of network traffic by 2033. This means that "AI and networks are no longer isolated technologies. They are increasingly integrated and interdependent."
Pulling together
But achieving this integration poses a range of challenges, Hu said, "including fragmented computing resources, latency constraints, and performance inconsistencies."
So across its wireline portfolio, Hu said, ZTE had developed its AI optical strategy to support intelligent AI-native architectures. "Our goal is to support telecom operators in transitioning from simply selling bandwidth to offering computing power as a service," he said. This approach extends across the infrastructure, campus, and home domains, he added.
But supporting the roll-out of AI is not just a question of speeds and feeds. The market is moving incredibly quickly, meaning operators and enterprises are in a race to design and deploy their networks. So, Hu explained: "In FTTx, our Light PON and Light ODN solutions feature a streamlined architecture that enables rapid, cost-effective deployment."
This includes AI enhancements to its planning tool to support one-click network design. "Additionally, our fiber fingerprint technology automatically builds and maintains a complete network map, slashing fault detection time."
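To make the fault-detection idea concrete, here is a minimal sketch of how a stored baseline trace of a fibre span might be compared against a live measurement to localize a break or excess loss. The function name, thresholds, and data shapes are hypothetical illustrations, not ZTE's fiber fingerprint implementation.

```
# Illustrative sketch only: compare a stored baseline reflection trace
# against a live measurement to localize a likely fault. Names, thresholds,
# and data shapes are hypothetical, not ZTE's fiber fingerprint technology.

def locate_fault(baseline_db, live_db, sample_spacing_m=1.0, threshold_db=3.0):
    """Return the distance (metres) of the first point where the live trace
    deviates from the baseline by more than threshold_db, or None."""
    for i, (ref, cur) in enumerate(zip(baseline_db, live_db)):
        if abs(cur - ref) > threshold_db:
            return i * sample_spacing_m
    return None

# Example: a clean baseline (~0.2 dB/km attenuation) and a live trace
# with a sudden 6 dB loss appearing roughly 3 km from the test point.
baseline = [-0.0002 * i for i in range(5000)]
live = baseline[:3000] + [v - 6.0 for v in baseline[3000:]]

print(locate_fault(baseline, live))  # -> 3000.0 metres from the test point
```

A production system would work from calibrated measurements and a full topology map rather than a toy trace like this, but the sketch shows why a maintained baseline shortens the hunt for a fault.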
ZTE's RAN product line has also benefited from having AI integrated into the architecture. Peng Aiguang explained that the RAN range incorporates AI agents to improve spectrum efficiency, operations and maintenance, and energy efficiency.
And the core networking line is now dubbed AI Core, Peng continued. This reflects not just that it is underpinning the movement of the massive amounts of data AI ecosystems demand, but that ZTE is integrating large language models and communication-specific language models into the infrastructure itself. "It helps operators transform from pipe providers to value enablers," he explained.
Similarly, Peng continued, AI is driving exponential growth in wireline traffic, placing greater demands on network build-out and more transmission pressure on optical networks. In response, ZTE has combined C-band and L-band communications in a single architecture.
"Our solution supports 400G/800G/1.6T per wavelength and enables single fiber capacity of nearly 100T," explained Peng. "This represents a 25 percent increase in system capacity over traditional solutions."
But speed and bandwidth are not the only factors that operators, enterprises, and ZTE have to take into account when meeting the demands of AI rollouts.
Doing it all, sustainably
Hu Kun, president for Western Europe at ZTE and CEO of ZTE Italia, said the company was acutely aware of the environment in which European companies have to operate.
"Energy optimization and carbon footprint reduction are now as important as performance and scalability," he said. "At the same time, cost efficiency remains critical, so solutions need to be modular, simple and quick to deploy."
So, he said, "ZTE's datacenter solutions emphasize energy efficiency and green operation, helping enterprises meet EU sustainability targets while controlling costs." But, he said, the firm is also very conscious of the data environment for European organizations, which means they need to process data locally for both speed and compliance reasons.
"With Europe's strict data sovereignty requirements, edge computing is becoming a key enabler," Hu explained. "Enterprises want seamless orchestration between local edge, private datacenter, and the public cloud."
Ultimately, said Peng, "In the AI era, user demands are no longer simply about network bandwidth or computing power, but rather task requirements."
The network will underpin the data requirements of AI. But, he added, "AI will become the brain of the computing network, dynamically scheduling the optimal computing and network resources from the cloud to the edge to meet these demands."
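As a sketch of what scheduling "from the cloud to the edge" could look like in practice, the toy placement policy below picks a compute site for each task based on a latency budget, a data-residency flag, and spare capacity. The site names, fields, and policy are illustrative assumptions, not a description of ZTE's AI Core.

```
# Toy placement policy: pick a compute site for a task given its latency
# budget, data-residency requirement, and each site's spare capacity.
# Site names, fields, and the policy itself are illustrative assumptions.

SITES = [
    {"name": "local-edge",   "latency_ms": 5,  "free_gpus": 2,   "in_region": True},
    {"name": "private-dc",   "latency_ms": 20, "free_gpus": 16,  "in_region": True},
    {"name": "public-cloud", "latency_ms": 60, "free_gpus": 128, "in_region": False},
]

def place(task):
    candidates = [
        s for s in SITES
        if s["latency_ms"] <= task["max_latency_ms"]
        and s["free_gpus"] >= task["gpus"]
        and (s["in_region"] or not task["data_must_stay_in_region"])
    ]
    # Prefer the lowest-latency site that satisfies all constraints.
    return min(candidates, key=lambda s: s["latency_ms"])["name"] if candidates else None

# A latency-sensitive inference job that must keep data in region lands at the
# edge; a large, latency-tolerant training job spills to the public cloud.
print(place({"max_latency_ms": 10,  "gpus": 1,  "data_must_stay_in_region": True}))   # local-edge
print(place({"max_latency_ms": 200, "gpus": 64, "data_must_stay_in_region": False}))  # public-cloud
```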
On one level, this will mean network as a service becomes a reality. On another, it means the potential of AI is transformed into tangible value, boosting the economy, transforming industry and healthcare, and delivering the truly smart home. And that is where people will really feel the impact of compute and networking convergence.
Sponsored by ZTE.