Accelsius and the drive for data center efficiency
Good morning. Inspired by one of our readers: if any of you are interested in sending in photos for inclusion in our Real or AI section, feel free to submit them here.
Would love to see what you come up with (plus, you’d be sure to get at least one day right).
— Ian Krietzberg, Editor-in-Chief, The Deep View
In today’s newsletter:
🤖 AI for Good: Search and rescue bots
🏛️ Mark Zuckerberg told Meta engineers to train on copyrighted material
⚡️ Accelsius and the drive for data center efficiency
AI for Good: Search and rescue bots
Source: RIKA, India
In a world of both increasing urbanization and increasing environmental threats, including earthquakes, hurricanes, storms and wildfires, emergency response — particularly, search and rescue in urban settings — is becoming more of a challenge.
It’s one of many arenas that, according to some developers, is ripe for automation.
The details: RIKA, an Indian research group, has developed a snakelike robot called COBRA that is designed to safely speed up search and rescue operations.
COBRA (the Collapse Building Search Assistive Robot) comes equipped with cameras and thermal sensors, all tied together with computer vision technology and other AI systems.
The robot can be deployed at rescue operations to identify where people are, and if they’re still alive, allowing rescue workers on the ground to work more quickly.
Unlike other rescue bots on the market, COBRA is far more accessible, specifically targeting low- and middle-income countries and areas, which are traditionally under-resourced.
“The problem is there are not enough people to do rescue work,” Ranit Chatterjee, co-founder & CEO of RIKA India, said. “The golden hour, which is the first one hour, is very crucial to save lives. That’s where automation is needed.”
COBRA was featured at the United Nations’ recent AI for Good, Impact India event.
Never Prep for a Sales Meeting Again
Bounti’s AI will save you 10+ hours/week of painful account research and meeting prep, empowering you and your team to sell more effectively through every channel and stage of the funnel.
Deliver the right message to your ideal buyer—driving real engagement at a fraction of the cost of other solutions.
Research: Get a cheat sheet for every account that matters to you. In minutes, you’ll have everything you need to make your next conversation the reason they buy.
Create: Our AI will generate custom pitches and any type of content you want, all fully customizable to what you need to land your deal.
Engage: Have the confidence to secure every conversation with detailed messaging and positioning that your buyers care about.
Mark Zuckerberg told Meta engineers to train on copyrighted material
Source: Meta
One of the class-action copyright lawsuits Meta currently faces took a turn last week, following the release of unredacted documents indicating that CEO Mark Zuckerberg himself specifically approved Meta’s Llama team to use a dataset the company knew to include pirated content.
The details: The case — Kadrey v. Meta — is one of many copyright lawsuits that have been filed against generative AI developers. This one includes Sarah Silverman and other notable authors as plaintiffs.
Meta wanted to keep this information private, a request the judge called “preposterous … It is clear that Meta’s sealing request is not designed to protect against the disclosure of sensitive business information … Rather, it is designed to avoid negative publicity,” U.S. District Judge Vince Chhabria wrote. “This is reflected in a statement by a Meta employee from one of the documents Meta seeks to seal: ‘If there is media coverage suggesting we have used a dataset we know to be pirated, such as LibGen, this may undermine our negotiating position with regulators on these issues.’”
LibGen, a self-described “shadow library” that grants users free access to copyrighted material, has been sued numerous times, ordered to shut down and ordered to pay millions of dollars to rights holders.
According to the filing, top Meta engineers were initially hesitant to train Llama on LibGen, saying: “torrenting from a [Meta-owned] corporate laptop doesn’t feel right,” since it was a “dataset we know to be pirated.”
According to internal memos cited in the filing, following “‘escalation to MZ,’ Meta’s AI team ‘has been approved to use LibGen.’” (MZ here refers to Zuckerberg).
Meta, like many other developers, is maintaining its fair use defense, a claim that numerous legal experts find dubious at best. Still, it’s one that is being tested in a number of cases. This case itself is far from concluded — and at the moment, only pertains to earlier releases of Llama — though seeming evidence of knowing infringement of copyrighted content certainly doesn’t help Meta’s claims.
“Had Meta bought Plaintiffs’ works in a bookstore or borrowed them from a library and trained its Llama models on them without a license, it would have committed copyright infringement,” plaintiffs’ lawyers wrote in the filing, adding that this constitutes “proof of copyright infringement.”
Last week, I sat down with Pindrop’s CEO to discuss that intersection of AI, cybersecurity and fraud. Check it out here if you missed it.
Ahead of looming ban, TikTok creators ask fans to find them on Instagram or YouTube (CNBC).
California wildfires update: Winds intensify as the death toll rises (NBC News).
The best actually real stuff at CES 2025 (The Verge).
2024 was the hottest year on record, and the 1st to breach the 1.5 C global warming limit, data reveals (Live Science).
FTC, DOJ back key allegations in Elon Musk’s antitrust case against Sam Altman’s OpenAI, Reid Hoffman (The New York Post).
If you want to get in front of an audience of 200,000+ developers, business leaders and tech enthusiasts, get in touch with us here.
Microsoft has filed a lawsuit against a group of cybercriminals that allegedly ran a hacking scheme that involved first compromising legitimate customer accounts, and then selling access to those accounts with instructions on getting around guardrails in order to produce illicit content with Microsoft’s generative AI tools.
YouTubers and other content creators are selling their unused content to AI developers for training purposes. OpenAI, Google and other developers are collectively paying hundreds of creators in individual deals that could be worth thousands each.
Accelsius and the drive for data center efficiency
Source: Accelsius
The core of the artificial intelligence sustainability problem (both economic and environmental) involves the data centers that make the tech possible — data centers filled with increasingly dense racks of GPUs, chips that are simply far more energy-intensive than their CPU counterparts.
The problem in the data center is twofold. First, there’s the simple cost (in dollars and electricity) of compute, that is, of running applications using those more intensive chips. Second, there’s the cost of cooling the chips themselves.
There are two main ways data centers have traditionally approached cooling: air cooling (literally, air conditioning the chips) and liquid cooling. Both are energy-intensive and costly. But the complexity of GPUs — and now, the sheer quantity of them — has created a challenge; GPUs run hot, so they require more intensive cooling. Air cooling isn’t enough.
So data centers have begun turning to liquid cooling, leading, as this study pointed out last year, to the massive consumption of freshwater from local areas. But drawing in the water alone doesn’t automatically cheapen the electrical bill; that water needs to be chilled and then re-chilled as it runs through the data center, meaning the cost becomes significant in both water consumption and electricity (though it’s more efficient than air cooling).
This cycle is one that is only growing amid the push toward a broader integration of AI, which means more GPUs to support more applications, which, in turn, means hotter data centers and more electricity expenditure.
As we wrote about recently, U.S. data centers, which in 2023 consumed some 4.4% of total electricity, are expected to consume a minimum of 12% of total electricity in the country by 2028.
“Having spent 40 years in the data center sector, there was a lot of lip service around sustainability, particularly in the old days. Basically, if they were inefficient, it helped them with revenue,” Josh Claman, CEO of Accelsius, a next-generation data center cooling company, told me. Energy inefficiencies meant data centers could charge their clients more; it didn’t matter if half of the cost was coming from cooling rather than compute. Efficiency didn’t matter.
Now, efficiency is starting to matter again, though today it has become efficiency for economic sustainability, rather than environmental.
“Just when data centers were getting serious about sustainability … AI came along,” Claman said. “If you do the math — depending on the source material — there may not be enough energy generated on Earth to provide energy for the growing AI workloads across the world. So now suddenly you have a very practical view of sustainability.”
Claman said that the cost of compute has become both so high and so desperately cultivated that developers don’t want any of their allocated electricity to be wasted on cooling, when it can instead go toward compute.
“And now their agenda sort of aligned with ours, which is, you want as efficient a data center, whatever the workload is, as you can have it,” Claman said. What we’re seeing right now is that many legacy data centers simply don’t have the advanced infrastructure to handle liquid cooling. So data centers are doing whatever it takes to “stand up these AI workloads,” and, in tandem, their energy consumption is steadily spiking.
Accelsius, founded in 2022 by Innventure, has developed a two-phase, direct-to-chip liquid cooling system that uses a dielectric fluid to better remove heat from the chips without the risk of water damage.
This system, according to Accelsius, delivers 50% energy savings, which can translate into an 80% reduction in carbon emissions compared to air cooling technology.
Claman told me the company was founded on the premise both that chips were going to get hotter, and that the more heat “you want to transport, the more water you need to pump through these small orifices, to the point that the solution destroys itself.”
“So if you look at Nvidia's chip roadmap, or AMD chip roadmap, we're going to be hitting north of 2000 watts per socket in the near term,” he said. “That's not distant future, that’s sort of next-generation stuff. Where it goes from there, I don't know, but it's going to get hotter, not cooler.”
Accelsius in November secured $24 million in Series A funding, money that the company will use in part to boost its international expansion.
Claman told me that, when pitching companies in Europe, Accelsius leads with sustainability and follows with the details. It’s the reverse in the U.S.
“It’s an emphasis difference,” he said. “The U.S. folks care less about sustainability. I personally think it's a bit of an indictment on our culture. But it is what it is. So we typically lead with the commercial problems.”
Southeast Asia and the Middle East, he said, are somewhere in between those two extremes.
“I feel like our timing was very good,” Claman said. “We knew that timing was right for liquid cooling. Did we know the AI revolution was just right around the corner? To the extent that it was? I don't think anyone called that, but it's certainly been a good tailwind for us.”
Which image is real?
🤔 Your thought process:
Selected Image 1 (Left):
“The trees at an angle in Image 2 gave it away.”
Selected Image 2 (Right):
“damn! thought the snow texture in image 2 looked more real, and lighting looked weird in 1.”
💭 A poll before you go
Thanks for reading today’s edition of The Deep View!
We’ll see you in the next one.
Here’s your view on regulation in 2025:
42% of you expect AI regulation to tighten around the world in 2025, even as it loosens in the U.S. 20% expect U.S. state regulations to pick up.
The rest aren’t too sure.
Something else:
“My guess is that this will vary widely - some countries will have loose regulation, and that in the first half of 2025, the US will loosen at the federal level. And that also some countries around the world and some US states (like California, Massachusetts, and other left-leaning states with high-tech industries) will model tighter AI regulation by the end of 2025/ early 2026 as a response to the initial loosening trend.”
Looser in the US:
“The fact that the new POTUS has gone on record saying that restrictions would be relaxed, kind of gives us all a clue. Elon invested heavily in Trump for a reason.”
Do you think these filings will swing the case against Meta, and possibly other developers?