Despite the flashy title, that's the first "sober" analysis of the technology from a CEO that I've read. While not exactly news, it's also worth mentioning that the energy requirements are impossible to fulfill.
I've also been using ChatGPT intensively for months now for all kinds of tasks, and have tried Claude etc. None of this is on par with a human. The code snippets are straight out of Stack Overflow...
Octoth0rpeyesterday at 6:47 PM
> Krishna also referenced the depreciation of the AI chips inside data centers as another factor: "You've got to use it all in five years because at that point, you've got to throw it away and refill it," he said
This doesn't seem correct to me, or at least is built on several shaky assumptions. You would have to 'refill' your hardware if:
- AI accelerator cards all start dying around the 5 year mark, which is possible given the heat density/cooling needs, but doesn't seem all that likely.
- Technology advances such that only the absolute newest cards can run _any_ model profitably, which only seems likely if we see some pretty radical advances in efficiency. Otherwise, assuming your hardware is stable after 5 years of burn-in, you could continue to run older models on it for only the cost of floor space and power. Maybe you need new cards for new models for some reason (a new FP format that only new cards support? some magic amount of RAM? etc.), but there seems to be room for revenue from older, less capable models at a discounted rate.
myaccountonhnyesterday at 6:46 PM
> In an October letter to the White House's Office of Science and Technology Policy, OpenAI CEO Sam Altman recommended that the US add 100 gigawatts in energy capacity every year.
> Krishna also referenced the depreciation of the AI chips inside data centers as another factor: "You've got to use it all in five years because at that point, you've got to throw it away and refill it," he said.
And people think the climate concerns of AI are overblown. The US currently has ~1,300 GW of generating capacity; adding 100 GW a year is a huge increase.
mbreeseyesterday at 7:22 PM
I would add an addendum to this -- there is no way the announced spending on AI data centers will all come to fruition. I have no doubt that there will be a massive build-out of infrastructure, but it can't reach the levels that have been announced. The power requirements alone will stop that from happening.
m101today at 2:07 PM
To all those people who think IBM doesn't know anything, calculate this number:
# of companies 100+ years old / # of companies that have existed over the last 100+ years
Then you will see why IBM is pretty special and probably knows what it is doing.
PeterStuertoday at 7:29 AM
If AI is a highlander market, then the survivor will eventually be able to acquire all those assets on the cheap from the failing competitors that flush their debt in bankruptcy.
Meanwhile, highlander hopefuls are spending other people's money to compete, some of them with dreams not just of building a tech empire, but of truly owning the machine that will rule the world in every aspect.
Investors are keen on backing the winner. They just do not know yet who it will be.
badmonsteryesterday at 7:15 PM
He's right to question the economics. The AI infrastructure buildout resembles the dot-com era's excess fiber deployment - valuable long-term, but many individual bets will fail spectacularly. Utilization rates and actual revenue models matter more than GPU count.
ic_fly2yesterday at 6:52 PM
IBM might not have a data strategy or AI plan, but he isn't wrong about the inability to generate a profit.
A bit of napkin math:
NVIDIA claims 0.4 J per token for their latest generation.
A 1 GW plant with 80% utilisation can therefore produce ~6.3 × 10^16 tokens a year.
There are ~10^14 tokens on the internet. ~10^19 tokens have been spoken by humans... so far.
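Roughly, as a quick Python sketch (assuming a 365-day year and taking NVIDIA's 0.4 J/token claim at face value; the comparison counts are the rough figures above):

    # Napkin math: tokens per year from a 1 GW plant at 80% utilisation,
    # assuming NVIDIA's claimed 0.4 J per token.
    JOULES_PER_TOKEN = 0.4
    PLANT_WATTS = 1e9               # 1 GW
    UTILISATION = 0.8
    SECONDS_PER_YEAR = 365 * 24 * 3600

    energy_joules_per_year = PLANT_WATTS * UTILISATION * SECONDS_PER_YEAR
    tokens_per_year = energy_joules_per_year / JOULES_PER_TOKEN
    print(f"{tokens_per_year:.2e} tokens/year")        # ~6.31e16

    # Rough comparison points quoted above
    internet_tokens = 1e14   # ~all text on the internet
    spoken_tokens = 1e19     # ~all tokens ever spoken by humans
    print(f"{tokens_per_year / internet_tokens:.0f}x the internet's text per year")
    print(f"{spoken_tokens / tokens_per_year:.0f} plant-years to match all human speech")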
1970-01-01today at 1:25 AM
There's really 3 fears going on:
1. The devil you know (bubble)
2. The devil you don't (AI global revolution)
3. Fear of missing out on devil #2
I don't think IBM knows anything special. It's just more noise about fears #1 and #3.
skeeter2020yesterday at 6:57 PM
The interesting macro view on what's happening is to compare a mature data center operation (specifically a commoditized one) with the utility business. The margins here, and in similar industries with big infra build-out costs (ex: rail) are quite small. Historically the businesses have not done well; I can't really imagine what happens when tech companies who've only ever known huge, juicy margins experience low single digit returns on billions of investment.
bluGillyesterday at 6:42 PM
I question the depreciation. Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question. CPUs stopped getting exponentially faster 20 years ago (they are faster, but not the jumps we got in the 1990s).
stevenjgarnertoday at 1:06 AM
"It is 1958. IBM passes up the chance to buy a young, fledgling company that has invented a new technology called xerography. Two years later, Xerox is born, and IBM has been kicking themselves ever since. It is ten years later, the late '60s. Digital Equipment DEC and others invent the minicomputer. IBM dismisses the minicomputer as too small to do serious computing and, therefore, unimportant to their business. DEC grows to become a multi-hundred-million dollar corporation before IBM finally enters the minicomputer market. It is now ten years later, the late '70s. In 1977, Apple, a young fledgling company on the West Coast, invents the Apple II, the first personal computer as we know it today. IBM dismisses the personal computer as too small to do serious computing and unimportant to their business." - Steve Jobs [1][2][3]
Now, "IBM CEO says there is 'no way' spending on AI data centers will pay off". IBM has not exactly had a stellar record at identifying the future.
A decade ago, IBM was spending enormous amounts of money to tell me stuff like "cognitive finance is here" in big screen-hogging ads on nytimes.com. They were advertising Watson, vaporware which no one talks about today. Are they bitter that someone else has actually made the AI hype take off?
mathattacktoday at 1:12 AM
Interesting to hear this from IBM, especially after years of shilling Watson and moving from being a growth business to the technology audit and share buyback model.
zeckalphatoday at 1:09 AM
Reminds me of all the dark fiber laid in the 1990s before DWDM made much of the laid fiber redundant.
If there is an AI bust, we will have a glut of surplus hardware.
Archelaostoday at 4:23 AM
Gartner estimates that worldwide AI spending will total US$1.5 trillion in 2025.[1] As of 2024, global GDP is US$111.25 trillion per year.[2] The question is how much AI can increase that; this defines the market volume for AI. Today's investments have a certain lifespan until they become obsolete. For custom software I would estimate 6-8 years, and AI investments should be somewhere in that range.
Taking all this into consideration, the investment volume does not look oversized to me -- unless one is quite pessimistic about the impact of AI on global GDP.
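One rough way to put those numbers together; the 7-year lifespan and the 10% required return are my own assumptions, not Gartner's:

    # Rough sanity check: annual AI spend vs. the GDP uplift needed to justify it.
    annual_ai_spend = 1.5e12        # Gartner estimate for 2025, US$
    global_gdp = 111.25e12          # 2024 global GDP, US$
    lifespan_years = 7              # assumed useful life (middle of the 6-8 year range)
    required_return = 0.10          # assumed annual return needed to justify the spend

    capital_stock = annual_ai_spend * lifespan_years      # steady-state AI capital in service
    required_annual_value = capital_stock * required_return
    print(f"AI capital stock in service: ${capital_stock / 1e12:.1f}T")
    print(f"Value needed per year: ${required_annual_value / 1e12:.2f}T "
          f"({required_annual_value / global_gdp:.1%} of global GDP)")

Under those assumptions the required uplift is on the order of 1% of global GDP per year, which is the sense in which the spend doesn't look obviously oversized.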
Coming from the company that missed on consumer hardware, operating systems, and cloud.
He might be right, but IBM isn't where I'd look for guidance on what will pay off.
6thbittoday at 1:41 PM
"I think there is a world market for maybe five computers."
Let's hope IBM keeps their streak of bad predictions.
kopirgantoday at 1:11 PM
What's the legal status of all this AI code?! Is it likely that someone whose code was lifted as part of "learning" could sue?!
kopirgantoday at 1:09 PM
It's also strange that while solar installations already built wait years for grid capacity, this much extra energy is being planned...
criddellyesterday at 6:51 PM
> But AGI will require "more technologies than the current LLM path," Krishna said. He proposed fusing hard knowledge with LLMs as a possible future path.
And then what? These always read a little like the underpants gnomes' business model (1. Collect underpants, 2. ???, 3. Profit). It seems to me that the AGI business models require that one company have exclusive access to an AGI model. The reality is that it will likely spread rapidly and broadly.
If AGI is everywhere, what's step 2? It seems like everything AGI generated will have a value of near zero.
turtleyachttoday at 11:55 AM
Data centers house hardware, and it's a land grab for compute. What actually runs post-AI depends on its owners. A glut of processing might be spent reverse-engineering efficient heuristics versus the "magic box."
A bitter summer.
pjdesnoyesterday at 8:50 PM
> $8 trillion of CapEx means you need roughly $800 billion of profit just to pay for the interest
That assumes you can just sit back and gather those returns indefinitely. But half of that capital expenditure will be spent on equipment that depreciates in 5 years, so you're jumping on a treadmill that sucks up $800B/yr before you pay a dime of interest.
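A quick sketch of both pieces; the 10% interest rate is what the quoted "$8T of CapEx means roughly $800B of interest" implies, and the 50% equipment share and 5-year life are the assumptions above:

    # Interest vs. depreciation on the assumed $8T AI buildout.
    total_capex = 8e12              # $8T, the figure quoted above
    interest_rate = 0.10            # implied by "$8T -> ~$800B of interest"
    equipment_share = 0.50          # assumed: half of capex is short-lived equipment
    equipment_life_years = 5

    annual_interest = total_capex * interest_rate
    annual_depreciation = total_capex * equipment_share / equipment_life_years
    print(f"Interest:     ${annual_interest / 1e9:.0f}B/year")      # ~$800B
    print(f"Depreciation: ${annual_depreciation / 1e9:.0f}B/year")  # ~$800B
    print(f"Combined:     ${(annual_interest + annual_depreciation) / 1e12:.1f}T/year")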
kenjacksonyesterday at 6:36 PM
I don't understand the math about how we compute $80b for a gigawatt datacenter. What's the costs in that $80b? I literally don't understand how to get to that number -- I'm not questioning its validity. What percent is power consumption, versus land cost, versus building and infrastructure, versus GPU, versus people, etc...
nnurmanovtoday at 4:50 AM
I agree. Here is my thinking.
What if LLM providers made short answers the default (for example, up to 200 tokens, unless the user explicitly enables "verbose mode")? Add prompt caching and route simple queries to smaller models.
Result: a 70%+ reduction in energy consumption without loss of quality.
Current cost: 3-5 Wh per request. At ChatGPT scale, this is $50-100 million per year in electricity (at U.S. rates).
In short mode: 0.3-0.5 Wh per request. That is $5-10 million per year, a savings of up to 90%, or 10-15 TWh globally with mass adoption. This is equivalent to the power supply of an entire country, without the risk of blackouts.
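Back-of-the-envelope, with the request volume and electricity price as my own guesses (~1 billion requests/day and ~$0.10/kWh), so the absolute numbers are only order-of-magnitude consistent with the figures above:

    # Rough annual electricity cost at "ChatGPT scale": verbose vs. short answers.
    requests_per_day = 1e9          # assumed request volume
    price_per_kwh = 0.10            # assumed US electricity rate, $/kWh

    def annual_cost(wh_per_request):
        kwh_per_year = wh_per_request * requests_per_day * 365 / 1000
        return kwh_per_year * price_per_kwh

    verbose = annual_cost(4.0)      # midpoint of the 3-5 Wh range
    short = annual_cost(0.4)        # midpoint of the 0.3-0.5 Wh range
    print(f"Verbose: ${verbose / 1e6:.0f}M/year")
    print(f"Short:   ${short / 1e6:.0f}M/year ({1 - short / verbose:.0%} saving)")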
This is not rocket science, just a toggle in the interface and, I believe, minor changes in the system prompt. It increases margins, reduces emissions, and frees up network resources for real innovation.
And what if the EU or California enforces such a mode? That would greatly impact data center economics.
winddudetoday at 3:44 AM
$8T is the high end of the McKinsey estimate of $4-8T by 2030. That includes non-AI data-centre IT, AI data centres, and power infrastructure build-out, as well as real estate for data centres.
Not all of it would be debt. Google, Meta, Microsoft and AWS have massive profits to fund their build-outs. Power infrastructure will be funded by governments and tax dollars.
kolibertoday at 8:28 AM
NOTE: People pointed out that it's $800 billion to cover interest, not $8 billion, as I wrote below. My mistake. That adds 2 more zeroes to all figures, which makes it a lot more crazy. Original comment below...
$8 billion divided by the US adult population of 270 million comes out to about $3000 per adult per year. That's only to cover the cost of interest, let alone other costs and profits.
That sounds crazy, but let's think about it...
- How much does an average American spend on a car and car-related expenses? If AI becomes as big as "cars", then this number is not as nuts.
- These firms will target the global market, not US only, so number of adults is 20x, and the average required spend per adult per year becomes $150.
- Let's say only about 1/3 of the world's adult population is poised to take advantage of paid tools enabled by AI. The total spend per targetable adult per year becomes closer to $500.
- The $8 billion in interest is on the total investment by all AI firms. Not all companies will succeed. Let's say that the one that will succeed will spend 1/4 of that. So that's $2 billion per year, and roughly $125 per adult per year.
- Triple that number to factor in other costs and profits and that company needs to get $500 in sales per targetable adult per year.
People spend more than that on each of these: smoking, booze, cars, TV. If AI can penetrate as deep as the above things did, it's not as crazy of an investment as it looks. It's one hell of a bet though.
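Redoing that arithmetic with the corrected $800B/year interest figure from the note above; the population, market-share, and cost-multiple numbers are the assumptions from the bullets:

    # Redoing the per-adult arithmetic with the corrected $800B/year interest figure.
    annual_interest = 800e9          # $800B/year (corrected figure from the note above)
    us_adults = 270e6
    world_adults = us_adults * 20    # "number of adults is 20x"
    targetable = world_adults / 3    # ~1/3 of world adults can pay for AI tools
    winner_share = 1 / 4             # assume the eventual winner spent 1/4 of the total
    cost_multiple = 3                # triple to cover other costs and profit

    per_us_adult = annual_interest / us_adults
    per_targetable = annual_interest / targetable
    winner_per_targetable = per_targetable * winner_share * cost_multiple
    print(f"Per US adult:                ${per_us_adult:,.0f}/year")      # ~$3,000
    print(f"Per targetable adult:        ${per_targetable:,.0f}/year")    # ~$450
    print(f"Winner's needed sales/adult: ${winner_per_targetable:,.0f}/year")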
scrootyesterday at 6:51 PM
As an elder millennial, I just don't know what to say. That a once-in-a-generation allocation of capital should go towards... whatever this all will be, is certainly tragic given the current state of the world and its problems. I can't help but see it as the latest in a lifelong series of baffling, high-stakes decisions of dubious social benefit that necessarily have global consequences.
zobzutoday at 2:20 AM
Also IBM: we are fully out of the AI race, btw.
Also IBM: we're just an offshoring company now anyway.
So yeah.
Animatsyesterday at 7:21 PM
How much has actually been spent on AI data centers vs. amounts committed or talked about? That is, if construction slows down sharply, what's total spend?
ojrtoday at 4:27 AM
As long as the dollar remains the reserve currency of the world and the US retains its hegemony, a lot of the finances will work themselves out. The only threats to the US empire crumbling are losing a major war or extreme civil unrest, and those threats are astronomically low. The US is orders of magnitude stronger than the Roman Empire; I don't think people realize the scale or control.
1vuio0pswjnm7today at 12:10 AM
One thing we saw with the dot-com bust is how certain individuals were able to cash in on the failures, e.g., low cost hardware, domain names, etc. (NB. prices may exceed $2)
Perhaps people are already thinking about how they can cash in on the floor space and HVAC systems that will be left in the wake of failed "AI" hype.
boxedemptoday at 3:10 AM
Nobody really knows the future. What were originally consumer graphics cards turned out to be useful for delivering more compute than traditional CPUs.
Now that compute is being used for transformers and machine learning, but we really don't know what it'll be used for in 10 years.
It might all be for naught, or maybe transformers will become more useful, or maybe something else.
'no way' is very absolute. Unlikely, perhaps.
jstummbilligtoday at 7:58 AM
Well, at least it tells us something about the sentiment on HN that a lame insight built on self-admitted "napkin math" and an obvious conflict of interest garners 400 points.
pharos92today at 12:36 AM
I find it disturbing how long people wait to accept basic truths, as if they need permission to think or believe a particular outcome will occur.
It was quite obvious that AI was hype from the get-go. An expensive solution looking for a problem.
The cost of hardware. The impact on hardware and supply chains. The impact to electricity prices and the need to scale up grid and generation capacity. The overall cost to society and impact on the economy. And that's without considering the basic philosophical questions "what is cognition?" and "do we understand the preconditions for it?"
All I know is that the consumer and the general voting population lose no matter the outcome. The oligarchs, banking, government and tech-lords will be protected. We will pay the price whether it succeeds or fails.
My personal experience of AI has been poor. Hallucinations, huge inconsistencies in results.
If your day job exists within an arbitrary, non-productive linguistic domain, it's a great tool. Image and video generation? Meh. Statistical and data-set analysis? Average.
littlecranky67today at 10:26 AM
There are so many CEOs, tech experts, financial analysts and famous investors who say we are in an AI bubble - even AI-invested companies say that about themselves. My latest favorite "we are in an AI bubble" comment comes from Linus Torvalds himself, in the video with Linus from Linus Tech Tips [0].
That's like Boeing telling us we shouldn't build rockets.
simianwordstoday at 6:52 AM
If it is so obvious that it won't pay off, why is every company investing in it? What alpha do you have that they don't?
liampullestoday at 12:07 PM
Hypothetically speaking, if the AI hype bubble pops (or just returns to normalcy), would it be profitable to retarget the compute towards some kind of crypto mining? If so, could we expect the cryptocurrency supply to soar and the price to tank in short succession?
weerfeegleemtoday at 1:35 PM
One more data center and we will reach AGI, we promise, just give us more money.
eitallyyesterday at 7:05 PM
At some point, I wonder if any of the big guys have considered becoming grid operators. The vision Google had for community fiber (Google Fiber, which mostly fizzled out due to regulatory hurdles) could be somewhat paralleled with the idea of operating a regional electrical grid.
nashashmiyesterday at 7:41 PM
Don't worry. The same servers will be used for other computing purposes. And maybe that will be profitable. Maybe it will be beneficial to others. But this cycle of investment and loss is a form of wealth redistribution. Some benefit.
The banks and lenders always benefit.
RobRiveratoday at 3:41 AM
What kind of rapport does the CEO of IBM expect the general technology workforce to hold for them?
maxgluteyesterday at 6:49 PM
How long can AI GPUs stretch? An optimistic 10 years and we're still looking at $400B+ of profit needed to cover interest. As a depreciating asset, silicon is closer to tulips than to rail or fiber.
Aperockytoday at 7:43 AM
LLMs at their current utility do not justify this spending, but the outside chance that someone hits AGI may be worth it in expectation.
bluGillyesterday at 6:40 PM
This is likely correct overall, but it can still pay off in specific cases. However, those are not blind investments; they are targeted, with a planned business model.
wmfyesterday at 6:38 PM
$8T may be too big of an estimate. Sure you can take OpenAI's $1.4T and multiply it by N but the other labs do not spend as much as OpenAI.
bytesandbitstoday at 2:19 AM
Mind you, IBM makes $7B+ from keeping old-school enterprises hooked on 30-plus-year-old tech like z/OS and COBOL and their own badly outdated stack. Their AI division is frankly embarrassing. Of course they would say that. IBM is one of the most conservative, anti-progress leeches in the entire tech industry. I am glad they are missing out big time on the AI gold rush. To me, if anything, this is a green signal.
Ekarosyesterday at 8:24 PM
How much of Nvidia's price is based on a 5-year replacement cycle? If that stops or slows with new demand, could it also affect things? Not that 5 years seems like a very long horizon now.
parapatelsukhyesterday at 6:46 PM
The spending will be more than paid off, since the taxpayer is the lender of last resort.
There are too many funny names among the investors / creditors.
A lot of mountains in Germany and similar, ya know.
ta9000today at 7:53 AM
"It doesn't even have a keyboard!" energy.
matt_syesterday at 9:45 PM
There is something to be said about what the ROI is for normal (i.e. non-AI/tech) companies using AI. AI can help automate things; robots have been replacing manufacturing jobs for decades, but the ROI on that is easier to see and count: fewer humans in the factory, etc. There seem to be a lot of exaggerated claims being made these days about AI, and the AI companies have only begun to raise rates; those won't go down.
The AI bubble will burst when normal companies start to not realize their revenue/profit goals and have to answer investor relations calls about that.
rmoriztoday at 2:06 AM
The second buyer will make truckloads of money. Remember the data center and fiber network liquidations of 2001+: smart investors collected the overcapacity, and after a couple of years the money printer worked. This time it will be the same, only the single-purpose hardware (LLM-specific GPUs) will probably end up in a landfill.
matt-ptoday at 1:58 AM
Unless we get AGI.
jmclnxyesterday at 7:08 PM
I guess he is looking directly at IBM's cash cow, the mainframe business.
But I think he is correct; we will see. I still believe AI will not give the CEOs what they really want: no, or very cheap, labor.
qwertyuiop_yesterday at 6:42 PM
The question no one seems to be answering is: what will the EOL be for these newer GPUs being churned out by NVIDIA? What percentage of annual capital expenditure is GPU refresh? Will they be perpetually replaced as NVIDIA comes up with newer architectures and the AI companies chase the proverbial lure?
westurnertoday at 4:14 AM
Ctrl-F this thread for terms like: cost, margin
Is transistor density cost still the limit?
Cost model, Pricing model
What about more recyclable chips made out of carbon?
What else would solve for e.g. energy efficiency, thermal inefficiency, depreciation, and ewaste costs?
cmrdporcupinetoday at 3:31 AM
The investors in these companies and all this infrastructure are not so much concerned with whether any specific company pays off with profits, necessarily.
They are gambling instead that these investments pay off in a different way: by shattering high labour costs for intellectual labour and de-skilling our profession (and others like it) -- "proletarianising" it in the 19th-century sense.
Thereby increasing profits across the whole sector and breaking the bargaining power (and outsized political power, as well) of upper middle class technology workers.
Put another way this is an economy wide investment in a manner similar to early 20th century mass factory industrialization. It's not expected that today's big investments are tomorrow's winners, but nobody wants to be left behind in the transformation, and lots of political and economic power is highly interested in the idea of automating away the remnants of the Alvin Toffler "Information Economy" fantasy.
m3kw9today at 2:44 AM
Says the guy missing out on it
ninjaatoday at 12:09 AM
What does Jim Cramer have to say?
devmoryesterday at 7:05 PM
I suppose it depends on your definition of "pay off".
It will pay off for the people investing in it, when the US government inevitably bails them out. There is a reason Zuckerberg, Huang, etc are so keen on attending White House dinners.
It certainly won't pay off for the American public.
wtcactustoday at 6:52 AM
The same IBM that lost all races in the last 40 years? That IBM?
m00dytoday at 5:45 AM
He will get assassinated or fired.
thenthenthentoday at 5:43 AM
the mining industry enters the chat
BenFranklin100today at 4:00 AM
A lot of you guys in the AI industry are going to lose your jobs. LLM and prompt "engineering" experts won't be able to score an AI job paying as well as a barista.
oxqbldpxoyesterday at 7:29 PM
FB playbook. Act (spend) then say sorry.
sombragristoday at 12:16 AM
"yeah, there's no way spending in those data centers will pay off. However, let me show you this little trinket which runs z/OS and which is exactly what you need for these kinds of workloads. You can subscribe to it for the low introductory price of..."
bmaddumatoday at 12:43 AM
No wonder he is saying that: they lost the AI game, and no top researcher wants to work for IBM. They spent years developing Watson, and it is dead. I believe this is a company that should not exist.
verdvermyesterday at 6:16 PM
IBM's CEO is steering a broken ship that hasn't improved course; he's not someone whose words you should take seriously.
1. They missed the AI wave (hired me to teach Watson law, only to lay me off 5 weeks later; one cause of the serious talent issues over there)
2. They bought most of their data centers (via acquired companies); they have no idea how to build and operate one, not at the scale the "competitors" are operating at