
The hidden environmental costs of a centralised cloud

CUDO Ventures


The environmental impact of cloud computing is becoming increasingly well known. Over the past year, mainstream outlets have begun to recognise that the cloud is not the insubstantial, weightless thing its name implies.

In fact, the cloud relies on an extensive physical infrastructure that requires enormous quantities of energy to power it. As a result, it generates significant carbon emissions. A report by the French think tank The Shift Project garnered significant press attention when it revealed that the cloud’s carbon footprint is now equal to that of commercial plane travel.

The hyperscale cloud giants – Amazon, Microsoft and Google in particular – have taken steps to head off this negative press. All three have made superficially impressive sustainability promises, committing to reach carbon neutrality or even to be carbon negative by a given date.

But the sustainability promises of the hyperscale providers aren’t all they seem. Earlier this year, the Corporate Climate Responsibility Monitor report found that both Amazon and Google lack transparency when reporting on their emissions. As a result, their claims to be striving for sustainability are virtually impossible to substantiate.

And this isn’t the only problem. Focusing on carbon footprints as a marker of sustainability may be letting the centralised cloud providers off the hook. Carbon emissions are just one of the numerous ways the current cloud model damages the environment and undermines global sustainability goals.

Hyperscale cloud providers are accelerating several major threats to a livable future for the planet, from ever-growing water consumption to the rising tide of e-waste.

In this post, we’ll be looking at some of the hidden environmental costs of the centralised cloud and showing why only a decentralised alternative can offer a sustainable future.

Going beyond the carbon footprint

It’s no secret that the demand for cloud computing has soared in recent years, and this trend is likely to accelerate over the coming decade. The shift toward flexible working and the growth of data-intensive tech will drive the demand for computing power to unprecedented heights.

And this means emissions will continue to skyrocket at a critical time for the earth’s climate.

The industry leaders (AWS, Google and Microsoft) have all committed to prioritising sustainability for their cloud operations. But despite their attention-grabbing promises, the reality is less clear.

Despite a gradual shift toward transparency on the part of the cloud giants, data on the emissions they generate is not always easy to come by – or to interpret.

For instance, the dominant focus on carbon footprints can obscure the fact that greenhouse gas (GHG) emissions cover a range of categories, some of which are easy to overlook.

The widely used GHG Protocol standard separates emissions into three “scopes”:

  • Scope 1 refers to direct emissions from a company’s operations.
  • Scope 2 covers the impact of the energy a company uses to power these operations.
  • Scope 3 refers to “upstream and downstream” emissions in a company’s value chain.

While reporting Scope 1 and 2 emissions is relatively straightforward, Scope 3 is more complex. It requires companies to explore how their supply chains and product lifecycles impact the environment.

But given that Scope 3 emissions form a significant majority of the emissions for most big tech companies – in Google’s case, 60% of their emissions fall under Scope 3 – the failure to adequately account for them is concerning.
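As a rough illustration of how the three scopes combine into a total footprint, here is a minimal sketch using entirely hypothetical tonnage figures (chosen only so that Scope 3 makes up 60% of the total, echoing the Google figure above; these are not any company’s real data):

```python
# Hypothetical GHG inventory in tonnes of CO2-equivalent (tCO2e).
# The figures below are illustrative only, not any company's reported data.
inventory = {
    "scope_1": 40_000,   # direct emissions (e.g. on-site fuel combustion)
    "scope_2": 360_000,  # purchased energy (e.g. electricity for data centres)
    "scope_3": 600_000,  # upstream/downstream value chain (e.g. hardware manufacture)
}

total = sum(inventory.values())
scope_3_share = inventory["scope_3"] / total

print(f"Total footprint: {total:,} tCO2e")   # → Total footprint: 1,000,000 tCO2e
print(f"Scope 3 share: {scope_3_share:.0%}")  # → Scope 3 share: 60%
```

The point of the arithmetic is simple: if Scope 3 dominates the inventory, then a target that excludes it covers only a minority of a company’s actual footprint.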

The hyperscale providers currently aggregate their Scope 3 emissions into their global business reporting, making it impossible to assess the true impact of their cloud computing operations. Worse still, both Amazon and Google exclude Scope 3 emissions from their emissions targets altogether.

The impact of ignoring Scope 3 GHG emissions could be massive. A recent bombshell report found that Google’s carbon emissions would be more than double the reported amount if they factored in the footprint of their corporate cash and investment activities, as recommended by the GHG Protocol.

The growing threat of water scarcity

While GHG emissions may be underreported, the cloud giants are at least taking steps toward accountability in this area. When it comes to the other environmental costs of the centralised cloud, there is far less transparency.

Hyperscale data centres require enormous quantities of water to keep them cool. In contrast to energy consumption, however, data centre operators have had little incentive to track and report water usage.

A recent report by the Uptime Institute found that only half of data centre operators track water usage. That’s because water is largely considered an inexpensive commodity that can be used with impunity. As we’ll see below, that may soon change.

For hyperscale providers, there has been active reluctance to make their water usage public. Google, for instance, considered the amount of water its data centres used to be a trade secret. Nevertheless, the company’s latest sustainability report indicates that, across the business, Google’s water usage has more than doubled in the past five years. It is safe to assume that this is partly attributable to the growth of its cloud services.

As the climate crisis begins to bite, water scarcity is becoming a major threat. Devastating heatwaves are sweeping the world, making drought a global issue. This brings the monumental water requirements of hyperscale data centres into sharp relief.

The Southwestern US has been a flashpoint for struggles over the impact of data centers on local water supplies. Last May, the vice mayor of Mesa, Arizona, protested the $800m development of a massive new data centre just outside the city. The same month, the region was devastated by a drought of unprecedented proportions, with 38% of the region categorised as experiencing “exceptional drought”. The vice mayor remarked, “We are on red alert, and I think data centers are an irresponsible use of our water.”

Yet despite such concerns, a recent study claims that US data centres continue to display “a disproportionate dependency on scarce waters in the Western US”. Forced to choose between protecting over-stressed watersheds and accepting multi-million-dollar investments from big tech companies, local municipalities often put concerns about water usage to one side.

The e-waste problem

Water usage isn’t the only sustainability concern that has flown under the radar in the cloud computing industry.

While the Uptime Institute report found that only 51% of data centres tracked water usage, the number that tracked e-waste was even lower, at just 25%. At the same time, data centres are making a fast-growing contribution to the world’s e-waste problem. According to a 2019 report, data centres are responsible for generating 2m tonnes of e-waste per year – approximately 4% of the world’s total.

The environmental impact of e-waste is enormous, and it disproportionately affects those in developing countries. According to a 2019 report, 64% of the EU’s e-waste ends up in Africa. Moreover, the disposal process is often informal and highly dangerous. According to the World Health Organisation, children often take part in this work, exposing themselves to toxic chemicals in the process.

Despite these environmental threats, hyperscale providers continue to build new data centres at pace, requiring the manufacture of extensive new hardware. A recent report by Synergy Research Group noted that there are more than 300 hyperscale data centres in the pipeline. Unsurprisingly, the existing market leaders are responsible for the majority of these.

This raises a difficult question. Though Amazon, Google and others claim to be making major strides toward improving the sustainability of their cloud services, these efforts can at best mitigate the impact of building new infrastructure at such scale, not erase it.

In such circumstances, you’d be forgiven for asking: is there no other solution?

A sustainable future for the cloud

Following the recent mainnet launch of our blockchain network, Cudos is now turning its attention toward the pilot phase of its decentralised cloud solution, CUDO Compute.

The goal of CUDO Compute is to provide a sustainable and scalable alternative to the current highly centralised cloud. Core to the CUDO Compute model is the recognition that although centralised providers continue to build new infrastructure to power their operations, much of the world’s existing computing power is underutilised.

By providing a secure and decentralised cloud platform, CUDO Compute will help to build a world where no computing power is wasted. It will allow those with spare computing power at all scales – from gamers to data centres and enterprise systems – to utilise this excess capacity. Not only will users be able to earn passive income, but they’ll also be able to contribute toward a greener future for the cloud by reducing the need for centralised providers to build new hyperscale data centres.

We believe that the intensive demands of AI, machine learning, metaverses, and other innovations needn’t come at the expense of our climate goals.

That’s why we’ve built an extensive partnership network with some of the leading figures in the sustainable tech space. We are working with both ClimateTrade and KyotoProtocol.io to support our commitment to reaching carbon neutrality. By working together, we can ensure that the future of the Web3 space is built with the environment in mind.


If it hasn't already been made apparent, we are on the verge of a global climate crisis.

What people may not know is that currently, our main way of combatting this problem is the issue of carbon credits ♻️

Read more in the 🧵 below

— KyotoProtocol.io (@official_kpio) July 19, 2022


Most recently, we’ve partnered with the Web3 education and recruiting network BlockBeam to help nurture the next generation of green tech talent. After all, decentralised solutions will require as many people as possible to come together and make them a reality.

Support CUDO Compute’s decentralised alternative

A sustainable, decentralised future for the cloud will require a collective effort. We have already built an extensive network of over half a million users across 145 countries, but as CUDO Compute enters its pilot phase we will need those looking to buy or sell compute to help bring our vision to life.

If you’d like to help support CUDO Compute, register your interest today.


Come build with us today!

About Cudos

Cudos is powering the metaverse, bringing together DeFi, NFTs and gaming experiences to realise the vision of a decentralised Web3 and enabling all users to benefit from the growth of the network. We’re an interoperable, open platform launchpad that will provide the infrastructure required to meet the 1000x higher computing needs for the creation of fully immersive, gamified digital realities. Cudos is a Layer 1 blockchain and Layer 2 community-governed compute network, designed to ensure decentralised, permissionless access to high-performance computing at scale. Our native utility token CUDOS is the lifeblood of our network and offers an attractive annual yield and liquidity for stakers and holders.

Learn more about CUDOS:

Website, Twitter, Telegram, YouTube, Podcast, Discord, Medium

Subscribe to our Newsletter

Subscribe to the CUDO Compute Newsletter to get the latest product news, updates and insights.