
⚙️ Big Tech hyperscalers are building data centers in water-scarce locations

Good morning, and happy Friday.

The stock market is closed today in observance of Good Friday, which means one fewer day for investors to try, and mostly fail, to process everything that’s going on at the moment.

That’s gotta be a reason for celebration.

— Ian Krietzberg, Editor-in-Chief, The Deep View

In today’s newsletter:

  • 🔬 AI for Good: Regulatory RNA

  • 📊 AI firms dance between surging demand and growing uncertainty

  • 🌎 Big Tech hyperscalers are building data centers in water-scarce locations

AI for Good: Regulatory RNA

Source: Unsplash

When the Human Genome Project wrapped up 22 years ago, scientists were somewhat surprised by the results, namely, that the human genome contains only about 20,000 protein-coding genes (proteins are the ‘building blocks’ of life). 

Researchers had expected far more; those protein-coding genes account for only about 2% of the total genome.

The remaining 98% of our genome — the “dark side” of the genome — was long dismissed as irrelevant. But in the decades that have since elapsed, scientists have learned that this “dark side” plays a vital role in regulating the other 2%.

What happened: Biotech firm CAMP4 is building AI and machine learning models to find and target specific molecules — regulatory RNAs, or regRNAs — that directly control “the expression of nearby protein-coding genes.”

  • Specifically, CAMP4’s goal is to develop treatments for genetic disorders that result from too little or too much protein production. By identifying the specific regRNA molecules that control the protein production of each gene, they can tailor-make treatments to increase the protein production of under- or over-performing genes. 

  • The firm’s AI-powered platform identifies the interactions that occur in the “neighborhood” that contains the disease-related gene, further directing the team to the regRNA molecules that have the greatest impact on that specific disease-related gene. 

CAMP4 has already identified multiple targets and is in various pre-clinical phases. 

“If I just want to fix a single gene’s defective protein output, I don’t want to introduce something that makes that protein at high, uncontrolled amounts,” Dr. Richard Young, a co-founder and MIT professor, said in a statement. “That’s a huge advantage of our approach: it’s more like a correction than a sledgehammer.”

Grab Your Copy Of “The State Of AI In Business” Report

You’re a reader of The Deep View, which means you already know the immense value of AI. But, as with any budding industry, there are still so many questions to be answered.

The good news is, Pipedrive has done the heavy lifting to get those answers for you – and they’ve put them all in their brand new “The State Of AI In Business” report, which you can download for free right here. This report contains responses from 500 businesses on how they’re using AI, what challenges they face, and what we all can expect for the future. It’s a treasure trove of info, and it’s free to download right now.

AI firms dance between surging demand and growing uncertainty

When Taiwan Semiconductor Manufacturing Company (TSMC) reported first-quarter earnings Thursday, the results were largely quite positive: $25.53 billion in revenue, a 35% year-over-year increase (though down 5% sequentially); a 60% year-over-year spike in profit to $11.12 billion; and a forecast that affirms 20% growth in sales and a doubling of revenue from AI-related servers.

  • “Our business in the first quarter was impacted by smartphone seasonality, partially offset by continued growth in AI-related demand,” TSMC’s CFO, Wendell Huang, said in a statement.

  • As Nvidia’s main supplier, TSMC has been a massive — if slightly more subtle — beneficiary of the AI boom; as with Nvidia, TSMC’s performance acts as a good bellwether for the rest of the industry, since flagging demand tends to show up in semiconductor orders first.

For now, TSMC expects that strong AI-related demand to continue. 

Still, Taiwan-listed shares of TSMC fell about 1% following the report. And rather than bolstering other chip- and AI-related stocks, the sector, led by Nvidia, Broadcom, Intel and AMD, tumbled Thursday. 

Noting the macroeconomic environment, one marked by global tariffs alongside increasingly strict export restrictions, Huang wrote: “While we have not seen any changes in our customers’ behavior so far, uncertainties and risks from the potential impact from tariff policies exist.”

The acknowledgement of uncertainty echoes a similar note made by ASML, though it’s a positive sign that TSMC’s guidance remains intact for the moment. 

  • In March, TSMC said it would invest an additional $100 billion to build a total of six U.S.-based facilities, bringing its total U.S. investments up to $165 billion.

  • In April, President Donald Trump said he threatened the company with a 100% tax if it didn’t move more of its production to the U.S.

And while TSMC hasn’t noticed a tariff impact yet, semiconductor orders tend to be longer-term, and the first quarter wrapped up before the tariff announcements came into play.

With the “reciprocal” tariffs on pause for the moment, I sat down with supply chain management expert Dr. Nada Sanders to unpack the impact the tariffs and export restrictions are likely to have on the field of AI, including the enormity of the challenge involved in bringing semiconductor manufacturing to the U.S.

You can check out the episode here.

  • Bad trends: The latest viral trend kicked off by OpenAI’s o3 and o4-mini isn’t so great — users are dropping images into ChatGPT and asking it to identify the location pictured. While previous models were quite capable in this domain as well, the capability poses enormous privacy and security risks, the kind you’d think would warrant a set of guardrails.

  • Remember Monopoly? Google has one: A federal judge has ruled that Google has an illegal monopoly in ad technology, setting up the possibility of a breakup. But the outcomes here are far from certain (and likely far from occurring) — the judge will next hear arguments regarding possible remedies for the monopoly, and with appeals, the case could drag on for years.

  • Is this a hint of life on another world, or just a lot of hot air? (NPR).

  • Meta says it will resume AI training with public content from European users (ABC).

  • Trump Administration considers bans and restrictions on China’s DeepSeek (NYT).

  • Popular AI ranking website Chatbot Arena is becoming a real company (Bloomberg).

  • Brain implant cleared by FDA for Precision Neuroscience, a Musk Neuralink rival (CNBC).

Hiring smarter starts with understanding salary trends.

  • Get global salary data on engineers, designers, product managers and more.

  • Explore top-tier talent with experience at AWS, Google, PwC, and beyond.

  • Learn how to cut hiring costs and reinvest in growth.

Big Tech hyperscalers are building data centers in water-scarce locations

Source: Unsplash

As developers have raced to outdo each other in generative AI benchmark scores and chatbot arenas, pushing hard to make GenAI a daily-use tool for as many people as possible, a similar race has been going on behind the scenes. A hardware race. 

Generative AI — like all other internet- and cloud-based technologies — exists at the tip of a vast hardware pipeline, one that includes undersea cables, advanced computer chips (Nvidia’s GPUs), the data centers that house them, the electricity needed to power those data centers and the water needed to cool them.

Those last two points are intimately related. 

Compared with their CPU cousins, GPUs are significantly more energy-intensive. As a result, they run hot. But to keep ChatGPT’s lights on, data center operators need to keep those chips as temperate as possible; and so, much like your mom might put a cool cloth on your feverish forehead, those operators largely use water to keep their chips chill.

We’ve talked often about the broader climate impacts resulting from AI-optimized data centers — a level of electricity consumption that weakens grids, incentivizes dirty energy supplies and leads to more carbon emissions. But when it comes to the water consumption of these data centers, the impact is less global and more local.

What happened: An investigation by Source Material and The Guardian found that the three major hyperscalers — Microsoft, Amazon and Google — are operating highly water-intensive data centers (with plans to build more) in “some of the world’s driest areas.” 

  • This is where localization becomes important. In seeking to quantify the water consumption of these data centers, a team of researchers found that training GPT-3 led to the direct evaporation of 700,000 liters of local freshwater. And that’s just the training, which is a finite process; that number doesn’t include ongoing water costs from continually operating these data centers. 

  • As one of those researchers, Shaolei Ren, explained to me on The Deep View: Conversations: “whether that’s considered large or small really depends on the context. If you’re talking about evaporating that water in Arizona, it’s probably not the perfect thing to do.” 

The water consumption Ren was talking about here is different from, say, the water you use when you take a shower.

“Water consumption is a technical term. It means the difference between water withdrawal minus water discharge. When you take a shower, you’re withdrawing a lot of water, but you’re not consuming water … most of the water you take for showering will be going into the sewage immediately, and that can be reused,” Ren said. “Water evaporation for data center cooling is just evaporated into the atmosphere. Eventually, it will come back. But when it comes back and where it comes back, that’s really uncertain, especially in places like California.”

The details: The Source Material investigation found that, together, the three companies operate 38 data centers in parts of the world that are already facing water scarcity. A further 24 data centers in similar regions are under development. 

  • There are roughly 12,000 operational data centers around the world today. About half of them exist in the U.S. And those three major companies are developing hundreds of additional data centers as we speak.  

  • In parts of the world where water is not scarce, all that water consumption is less of a problem (we can instead worry about carbon emissions and electricity consumption). But in parts of the world where water is scarce, it’s not good. In its 2023 ESG report, Microsoft said that 41% of its total water withdrawal came from areas under water stress. That amounted to 5,326 megaliters, which equates to roughly 5.3 billion liters. For context, the average person living in the U.S. uses around 310 liters of water per day. 
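For a rough sense of scale, here’s a back-of-the-envelope sketch in Python using only the figures cited above. The per-person comparison is my own illustration rather than something from Microsoft’s report, and keep in mind that the ESG figure measures withdrawal, not consumption.

```python
# Back-of-the-envelope math on Microsoft's water-stressed withdrawal figure,
# using only the numbers cited above. Illustrative only.

LITERS_PER_MEGALITER = 1_000_000             # 1 megaliter = 1 million liters

withdrawal_ml = 5_326                        # megaliters withdrawn in water-stressed areas (2023 ESG report)
withdrawal_liters = withdrawal_ml * LITERS_PER_MEGALITER
print(f"{withdrawal_liters:,} liters")       # 5,326,000,000 — roughly 5.3 billion liters

# Compare with the ~310 liters the average U.S. resident uses per day.
per_person_per_day = 310
person_days = withdrawal_liters / per_person_per_day
print(f"≈ {person_days:,.0f} person-days of water")           # ≈ 17.2 million person-days
print(f"≈ {person_days / 365:,.0f} people supplied for a year")  # ≈ 47,000 people

# Caveat: withdrawal is not the same as consumption. Per Ren's definition above,
# consumption = withdrawal - discharge, and only the evaporated share is "lost" locally.
```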

Similarly, Google’s 2024 environmental report noted that 15% of its water withdrawal came from areas with high water scarcity, while a further 16% came from places with medium water scarcity. 

And Amazon is planning a bunch of new data centers in northern Spain, a country dealing with an active threat of desertification. The planned centers — approved to use some 755,720 cubic meters of water per year — have incited fierce opposition from a number of local groups who have cited severe concerns over the ecological impact. 

One local farmer told The Guardian: “These data centers use water that comes from northern Aragon, where I am. They consume water — where do they take it from? They take it from you, of course.”

Nothing is ever free. 

There are just varying degrees of visibility around what, exactly, we pay for the things we use. 

In the case of social media — and, to a large degree, generative AI — the cost of admission is your data, not your dollar. That cost is often obscured, and, to many, wasn’t made clear at the time of sign-up. 

Now, of course, it’s too late for such information to matter to many people. 

‘Take my data; I need to use this platform.’

As degrees of cost visibility go, the environmental impacts of using the internet have been practically invisible since its inception. But, with generative AI, they are mounting, and they are actively harmful to people who are attempting to adapt to a rapidly changing climate. 

Collectively, we need to think a lot harder about the benefits of use that make the costs worthwhile. And we need to think harder about who will bear the brunt of those costs. It is easy to dismiss the water consumption issues going on in Spain, if you do not live in Spain, just as it is easy to dismiss the xAI-related air pollution problems of Memphis, Tennessee, if you do not live in Memphis, Tennessee. 

It would be a mistake to do so. 

That our actions have more intense consequences in certain locales should be an even greater reason for caution, caution that pushes for regulation and regulation that drives the kind of innovation that enables these companies to have their data centers without sucking these regions dry. 

Which image is real?


🤔 Your thought process:

Selected Image 2 (Left):

  • “I cannot point to any specifics, but the water looked better in Image 2.”

Selected Image 1 (Right):

  • “Thought image 1 was better focused! Wrong!”

💭 A poll before you go

Thanks for reading today’s edition of The Deep View!

We’ll see you in the next one.

P.S. Enjoyed reading? Take The Deep View with you on the go! We’ve got exclusive, in-depth interviews for you on The Deep View: Conversations podcast every Tuesday morning. Subscribe here!

Is Google getting broken up?


If you want to get in front of an audience of 450,000+ developers, business leaders and tech enthusiasts, get in touch with us here.