⚙️ Inside IBM's cutting-edge Albany research center
Here’s everything it takes to make AI tick

Good morning, and happy Friday.
Since the Studio Ghibli stuff hasn’t stopped, here are some more quotes from Miyazaki, from 2016, when a researcher told him that computers would soon be able to paint like humans: “I fear the world’s end is near. Humans have lost confidence. Hand drawing’s the only answer.”
— Ian Krietzberg, Editor-in-Chief, The Deep View
In today’s newsletter:
⚕️ The reality of AI and medicine
🏛️ Judge denies OpenAI’s motion to dismiss; case to proceed to trial
👁️🗨️ Here’s everything it takes to make AI tick
The realities of AI in medicine
We’ve been hearing, almost from day one (and every day since), that AI will ‘cure’ cancer.
It’s been several years since Mr. Altman started talking about that, and such a phenomenon has yet to occur.
See, the realities of AI technologies — by which I mean generative AI, language models and machine learning models — in healthcare are both highly nuanced and extremely complicated.
So we connected with several professors who are working at the cutting edge of this integration to decipher the reality of what AI is actually accomplishing for medicine today, as well as the near-term trajectory that we’re on.
This has nothing to do with ‘cures,’ magic or AGI.
It has to do with big data, advanced statistics and pattern recognition.
And the thing is, the current capability and near-term potential are as promising as they are complicated.

BOXABL round on StartEngine is closing tomorrow!
Most houses take seven months to complete.
BOXABL can roll one off the assembly line every four hours, including electrical, HVAC, and plumbing! Now the company is raising funds and has made shares available to the public, but the round is closing soon on StartEngine!
The company already has two home models and is developing a 20x30 and 20x40 home as part of their Phase 2 plan!
Currently, shares are being offered at $0.80 per share. With a $1,000 minimum investment, you can join 40,000 investors to help solve the housing crisis.
Time is running out, TOMORROW is the last day to invest in BOXABL on StartEngine.
Judge denies OpenAI’s motion to dismiss; case to proceed to trial

Source: Unsplash
A federal judge has denied OpenAI’s motion to dismiss the massive copyright lawsuit brought against the startup — and Microsoft, its biggest backer — by the New York Times.
The Times filed the case in 2023, alleging sweeping copyright infringement, violations of the Digital Millennium Copyright Act, trademark dilution and unfair competition.
The details: OpenAI had moved to dismiss claims of direct and contributory copyright infringement, in addition to DMCA violations and unfair competition.
After hearing oral arguments in January, Judge Sidney Stein did grant Microsoft and OpenAI’s motion to dismiss the Times’ claims of DMCA violations (the law holds that it is illegal to intentionally remove copyright-identification information; proving intent, here, is difficult).
But Stein denied the remainder of OpenAI’s motions for dismissal, allowing the landmark case to proceed to a jury trial.
The case has since been combined with two other class-action copyright infringement cases, brought by the Center for Investigative Reporting and the Daily News. Across the three, the judge allowed only the Daily News’ claim of DMCA violations to stand.
Stein has not yet published his reasoning behind his orders, but said he would soon.
"We are very pleased with Judge Stein’s ruling, determining that we can proceed on virtually all of our claims,” Steven Lieberman, an attorney for the plaintiffs, said in an emailed statement. “We appreciate the opportunity to present a jury with the facts about how OpenAI and Microsoft are profiting wildly from stealing the original content of newspapers across the country."
Why it matters: This is a significant, though unsurprising, blow to Microsoft and OpenAI. There are only a handful of important cases concerning AI-related copyright infringement, and this is one of the most significant, since it directly addresses the question of whether it is ‘fair use’ for developers to train commercial models on copyrighted material.
As we discussed yesterday, that remains an open question; even so, the fair use defense has functioned as the cornerstone of OpenAI’s arguments from day one.
I will watch a jury trial here with great interest.


A Ghiblified meltdown: In the days since OpenAI launched its image generation upgrade in ChatGPT, the function went viral for its (copyright-questionable) capacity to generate Studio Ghibli versions of images. But Sam Altman said that “our GPUs are melting”; OpenAI had to introduce stricter rate limits in response, with Altman saying that users on the free ChatGPT tier will get three generations per day.
A GenAI IPO: CoreWeave is set to IPO today, an offering that Nvidia will reportedly anchor at $40 per share with a $250 million order, a downsize from its initial proposed range of $47 to $55 per share. It’s an IPO that’s set to test the market’s interest in AI stocks at a time when those same stocks have been under a bit of pressure lately.

Welcome to the semantic apocalypse; Studio Ghibli style and the draining of meaning (The Intrinsic Perspective).
Silicon Valley bubble risks heighten as investors pile into funds that bet on a single buzzy startup (CNBC).
How Meta is getting its hands on advance digital books to train its AI (Literary Hub).
Exclusive: CoreWeave scales back its IPO ambitions in hit to AI hopes (Semafor).
What you need to know about Africa’s first AI factory (Rest of World).

Hiring smarter starts with understanding salary trends.
Get global salary data on engineers, designers, product managers and more.
Explore top-tier talent with experience at AWS, Google, PwC, and beyond.
Learn how to cut hiring costs and reinvest in growth.
Inside IBM's cutting-edge semiconductor research

The Albany NanoTech Complex. Source: IBM
Correction Note: The 14th, 18th and 19th paragraphs have been updated to more clearly reflect that NY Creates owns and operates the Albany NanoTech Complex.
When I go to pull up ChatGPT, I don’t even need to type the full web address.
No.
That problem was solved a long time ago, with an earlier iteration of the algorithms I now make a living writing about: autocomplete.
So I type ‘chat’ into my search bar, hit ‘enter,’ and I’m looking at a simple generative interface. The world’s knowledge — twisted through subtle hallucination, of course — at my fingertips.
What happens next isn’t much more difficult than navigating to the website.
Simple prompts. Vague ideas. Loose concepts.
And then you hit ‘enter,’ and you get a wealth of (possibly inaccurate) research, or a mini blog post complete with (possibly fake) citations, or a series of videos and images (that might not conform to the laws of physics), or perfectly (terribly) written business-casual emails.
The list goes on.
But somewhere many miles away, the lights are on at a data center lined with racks of wildly expensive computer chips that are directly making that generative environment work.
We don’t often think about it.
This isn’t unique to generative AI; the internet — and the cloud — are powered by layers of hardware: hundreds of thousands of miles of undersea cables and massive data centers lined with computer chips, chips that are dependent, of course, on tons of electricity.
But generative AI poses different challenges, and is run by different chips, chips that happen to be far more energy intensive than their predecessors. That energy demand, growing at a rate that exceeds supply, is actively weakening electric grids and worsening air pollution.
As important as the algorithms themselves are, it all comes back to the chips that make them possible.
And in Albany, at the heart of the East Coast’s answer to Silicon Valley, lies the Albany NanoTech Complex, an advanced semiconductor research facility born of a partnership between the New York Center for Research, Economic Advancement, Technology, Engineering and Science (NY CREATES), the non-profit that owns and operates the complex, the state of New York, local universities and corporate partners, including IBM. The facility is actively designing the future of chip technology.
The lab is “at the forefront of semiconductor research and development in the world,” Mukesh Khare, GM of IBM Semiconductors and head of its Albany lab, told me Monday.
Unlike other facilities, plenty of partner companies have access to the Albany research lab, which was established in the ‘90s with the intention of collectively advancing technical science while carefully protecting sensitive IP. “At the end, semiconductor is all about the ecosystem. It’s not about one company,” Khare said. “One company and one country cannot do it. We just have to work together.”
Hooded, cloaked, bootied and gloved, I had the chance to check it out.

Source: NY Creates
IBM’s work there is led, in many ways, by the same guiding focus that seems to govern the bulk of IBM’s work in AI and the cloud: efficiency. Beyond the robotic carts that glide along the ceiling to deliver semiconductor wafers from one clean room to the next, the work being done within those clean rooms — suffused, of course, in vivid yellow light, beneath the 24/7 clank and clamor of billions of dollars worth of semiconductor fabrication machines — is fundamentally about efficiency, a focus that mirrors the efforts of IBM’s research facility at Yorktown Heights.
“At the macro level, we want to make sure AI is more efficient,” Khare said. “Last time I checked, the power consumption for AI is a little high. It’s just out of control.”
IBM is pulling several different levers to achieve that goal. Khare called this “co-optimization.” It involves squeezing both the software and hardware for any possible gains in operational efficiency: “every part of the stack has room to improve,” he said.
At IBM, that means a focus on smaller, less energy-intensive language models designed to interface with customer-specific data. But it also means advanced optical technology designed to reduce the time chips spend idle within data centers, alongside advancements in and explorations of low-precision computing to more cheaply control the hardware within those data centers.
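One of those levers, low-precision computing, is easy to sketch in miniature. The following is a generic illustration of int8 quantization, a common way to cut inference cost, and not a description of IBM's actual scheme: weights are stored as 8-bit integers plus a scale factor, trading a little accuracy for much cheaper arithmetic and memory traffic.

```python
# Toy int8 quantization sketch (generic technique, not IBM's design):
# each float weight becomes a one-byte integer plus a shared scale.

def quantize(weights):
    """Map floats into the int8 range [-127, 127] with one scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the quantized values."""
    return [x * scale for x in q]

w = [0.12, -0.5, 0.33, 1.0]
q, s = quantize(w)
approx = dequantize(q, s)
# approx is close to w, but each weight now fits in a single byte
```

The approximation error here is bounded by half the scale factor per weight, which is the basic accuracy-versus-cost trade that low-precision hardware exploits.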
“Power and cost, those two are going to be the first things that we go after,” Khare said.

Source: IBM
And then at the chip level, IBM isn’t interested in developing chips for model training purposes — Nvidia’s GPUs are good enough for that. But it is interested in driving advancements in chips specifically built for inference, the day-to-day process of actually running a trained model.
The core thrust of this work involves IBM’s family of AIU chips, which have been in the works for years and are only just beginning to enter into use. An early installation of these chips drew one-eighth the power of a similar cluster of GPUs optimized for training.
At the same time, Khare explained that IBM is investing heavily in chiplet technology, an approach that breaks a chip into smaller pieces connected by advanced packaging, enabling faster processing in aggregate.
And on the cutting edge, the Albany lab is advancing research into a different kind of computing altogether, with something called analog AI. Today, the training and inference of AI models, like most computing, is done digitally: digital matrix multiplication and accumulation.
But an analog approach, which stores information physically, like the grooves in a record, eliminates the need for data to be shuttled from memory to a processor, meaning calculations can be done far more quickly.
“We need to develop new materials that can store weights of models, and those weights get represented in the form of resistance, and then you can use the analog computation to perform multiplication functions,” Khare said. “That is so much more energy efficient … (it’s) unbelievable.”
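The arithmetic Khare describes can be sketched with a toy model. This is a minimal numerical illustration of analog in-memory multiply-accumulate in general, not IBM's hardware: weights sit on a crossbar as conductances, inputs arrive as voltages, and physics does the matrix math, with Ohm's law multiplying and Kirchhoff's current law accumulating, so no weights ever move between memory and a processor.

```python
# Toy simulation of analog in-memory multiply-accumulate (an
# illustration of the general technique, not IBM's actual chip):
# weights are conductances G on a crossbar, inputs are voltages V.

def analog_matvec(conductances, voltages):
    """Each column's output current is the sum of G[i][j] * V[i]."""
    rows = len(voltages)
    cols = len(conductances[0])
    currents = [0.0] * cols
    for j in range(cols):
        for i in range(rows):
            # Ohm's law per cell: I = G * V
            currents[j] += conductances[i][j] * voltages[i]
    # Kirchhoff's current law: currents meeting on one wire simply add
    return currents

# A 2x2 weight matrix encoded as conductances, inputs as voltages
G = [[1.0, 2.0],
     [3.0, 4.0]]
V = [0.5, 1.0]
print(analog_matvec(G, V))  # [1.0*0.5 + 3.0*1.0, 2.0*0.5 + 4.0*1.0] = [3.5, 5.0]
```

In real analog hardware this whole loop collapses into a single physical event, which is where the energy savings Khare describes come from.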
In Albany, IBM researchers have developed that material and built a prototype that is currently being used to run models.
That’s the primary purpose of the Albany lab, to conduct and then implement research through prototypes that are fabricated, utilized and stress-tested on-site, at which point they’ll be sent to fabrication partners for mass production. The AIU chips were born, tested and prototyped in Albany, which is also where IBM’s exploration of chiplets takes place.
The analog AI chips aren’t yet ready for high-volume production, “but that’s part of my job. To keep working on it, and at some point in time, it will be ready,” Khare said.
But there’s a strange problem with efficiency, one highlighted by Microsoft CEO Satya Nadella in the wake of DeepSeek’s launch of R1: the more efficient these systems get, the more they’ll be used, meaning the aggregate cost of operation will increase.
It’s a phenomenon known as the Jevons paradox.
“If you make a straight-line projection of how much data center energy is needed and the rate at which AI is growing, then you’ll find that within 10 years, the planet’s entire power will be consumed just by data centers,” Khare said, adding, however, that that situation likely won’t play out. It’s “not real. But it means that … we are trying to optimize in every part of the stack.”
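The paradox is easiest to see with numbers. Here is a back-of-the-envelope illustration with entirely hypothetical figures, not drawn from IBM or Microsoft: a fourfold efficiency gain cuts the energy per query, but if cheaper queries drive usage up tenfold, total consumption still rises.

```python
# Back-of-the-envelope Jevons paradox illustration. All numbers are
# hypothetical: efficiency improves 4x, but usage grows 10x, so the
# aggregate energy bill goes up anyway.

def total_energy(queries, joules_per_query):
    """Total consumption is simply usage times per-unit cost."""
    return queries * joules_per_query

before = total_energy(queries=1_000_000, joules_per_query=4.0)
after = total_energy(queries=10_000_000, joules_per_query=1.0)

print(before, after)  # 4000000.0 vs 10000000.0: per-query cost fell, total rose
```

This is why, as Khare notes, straight-line projections of data center demand mislead in both directions: efficiency gains don't simply subtract from the total, and demand growth doesn't simply extrapolate.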
As to whether these full-stack efficiency gains might ever get us to the point where a data center could reach beyond mere efficiency to actually operate with carbon neutrality on clean power, Khare’s not sure.
But he thinks it’ll all come down to economics.
“It’s the economics that dominate; technology adapts to that. I don’t think the world’s entire energy will be consumed by GPUs,” he said. “And I also believe that more and more sustainable energy will play a bigger role because it will make economic sense.”


Which image is real?



🤔 Your thought process:
Selected Image 1 (Left):
“In Image 2 the saw’s powerhead is ridiculously tiny and is missing both a handle for the left hand and a kickback chain brake. The chain itself is not realistic. Shutter speed is way too high.”
Selected Image 2 (Right):
“I thought the chainsaw was too close to the bracer in Image 1. That’s how you get a broken chain and hopefully nothing else.”
💭 A poll before you go
Thanks for reading today’s edition of The Deep View!
We’ll see you in the next one.
Here’s your view on the Alexa updates:
A third of you never got an Amazon Echo for data privacy reasons, and a third of you plan to get rid of your Echo because of these same data privacy concerns.
20% don’t care.
Getting rid of my Echo:
“I like Apple's Home app better, anyway.”
What do you think about the Studio Ghibli stuff?
If you want to get in front of an audience of 450,000+ developers, business leaders and tech enthusiasts, get in touch with us here.
Boxabl Disclosure: This is a paid advertisement for Boxabl’s Regulation A offering. Please read the offering circular here. This is a message from Boxabl