How much longer until the AI bubble pops? I'm tired of this.
It's when the coffers of Microsoft, Amazon, Meta and the investment banks dry up. All of them are losing billions every month, but it's all driven by fewer than 10 companies. Nvidia is lapping up the money of course, but once the AI companies stop buying GPUs in crazy numbers it's going to be a rocky ride down.
Is it like crypto, where CPUs were good, then GPUs, then FPGAs, then ASICs? Or is this different?
Wildly different, though similar in that ASICs were tuned to specific crypto tasks; everyone's making custom silicon for neural nets and such.
I wouldn't plan on it going away. Apple put optimized neural-net chips in their latest phones. Same with Samsung.
I think it’s different. The fundamental operation of all these models is multiplying big matrices of numbers together. GPUs are already optimised for this. Crypto was trying to make the algorithm fit the GPU rather than it being a natural fit.
With FPGAs you take a 10x loss in clock speed but can have precisely the algorithm you want. ASICs then give you the clock speed back.
GPUs are already ASICs that implement the ideal operation for ML/AI, so FPGAs would be a backwards step.
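To make the matmul point concrete: a transformer-style layer boils down to a handful of matrix multiplies, which is exactly the operation GPUs (and their tensor cores) are built to stream. A toy NumPy sketch — the shapes and weights here are illustrative, not from any real model:

```python
import numpy as np

# Toy attention-style "layer": everything reduces to matrix multiplies,
# the one operation GPU tensor cores are purpose-built to accelerate.
rng = np.random.default_rng(0)
seq_len, d_model = 8, 16          # illustrative sizes only

x = rng.standard_normal((seq_len, d_model))
w_q = rng.standard_normal((d_model, d_model))
w_k = rng.standard_normal((d_model, d_model))
w_v = rng.standard_normal((d_model, d_model))

q, k, v = x @ w_q, x @ w_k, x @ w_v              # three matmuls
scores = q @ k.T / np.sqrt(d_model)              # another matmul
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
out = weights @ v                                # and one more

print(out.shape)  # (8, 16)
```

Crypto hashing had to be contorted to fit this shape; here the workload *is* this shape, which is why the hardware fit is so natural.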
Thank you for the explanation!
If bitnet or some other technical innovation pans out? Straight to ASICs, yeah.
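For context on why a bitnet-style breakthrough points straight at ASICs: if weights are constrained to {-1, 0, +1}, the matmul needs no multipliers at all, only adds and subtracts, which is extremely cheap in dedicated silicon. A hedged sketch — the quantization rule below is a simplified stand-in, not the actual BitNet recipe:

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.standard_normal((4, 4))   # full-precision weights
x = rng.standard_normal(4)        # input activations

# Simplified ternary quantization: scale, then round weights into {-1, 0, +1}.
scale = np.abs(w).mean()
w_t = np.clip(np.round(w / scale), -1, 1)

# With ternary weights, each output element is just sums and differences of
# inputs -- no multiplier circuits needed, which is what makes an ASIC cheap.
y = np.array([x[w_t[i] == 1].sum() - x[w_t[i] == -1].sum() for i in range(4)])

# Identical result to the ordinary matmul against the quantized weights.
assert np.allclose(y, w_t @ x)
```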
Future smartphones will probably be pretty good at running them.
It's probably different. The crypto bubble couldn't actually do much in the way of useful things.
Now, I'm saying that with a HUGE grain of salt, but there are decent applications for LLMs (let's not call them AI). Unfortunately, these uses are not really in the sights of any business putting tons of money into their "AI" offerings.
I kinda hope we'll get better LLM hardware to operate privately, using ethically sourced models, because some stuff is really neat. But that's not the push they're going for right now. Fortunately, we can already sort of do that, although the provenance of many publicly available models is currently… not that great.
LLMs are absolutely amazing for a lot of things. I use them at work all the time to check code blocks or remember syntax. They are NOT and should NOT be your main source of general information, and we collectively have to realise how problematic and energy-consuming they are.
There's absolutely a push for specialized hardware; look up the company called Groq!
Yes, but at this point most specialized hardware only really works for inference. Most players are training on NVIDIA GPUs, with the primary exception of Google, who have their own TPUs, but even those have limitations compared to GPUs (certain kinds of memory accesses are intractably slow, making them unable to work well for methods like Instant NGP).
GPUs are already quite good, especially with things like tensor cores.
We’re still in the “IT’S GETTING BILLIONS IN INVESTMENTS” part. Can’t wait for this to run out too.
As long as certain jobs and tasks can be done more easily, and searches can be done faster, it's gonna stay. Not a fad like NFTs. The bubble here is the energy and water consumption part.
I'm still waiting for somebody to prove any of those statements are true. And I say that as somebody working in a company that demands that several employees use AI. All I see is that they now take extra time manually fixing whatever bad output the LLM produced, and are slowly losing their ability to communicate without first consulting ChatGPT, which is both slow and concerning.
✨
Here's the thing: it kind of already has. The new AI push is related to smaller projects and AI agents like Claude Code and GitHub Copilot integration. MCPs are also starting to pick up steam as a way to refine prompt engineering. The basic AI "bubble" popped already; what we're seeing now is an odd arms race of smaller AI projects, thanks to companies like DeepSeek pushing AI hosting costs so low that anyone can reasonably host and tweak their own LLMs without it costing a fortune. It's really an interesting thing to watch, but honestly I don't think we're going to see the major gains the tech industry is trying to push anytime soon. Take any claims of AGI and OpenAI "breakthroughs" with a mountain of salt, because they will do anything to keep the hype up and drive up their stock prices. Sam Altman is a con man and nothing more; don't believe what he says.
You're saying the AI bubble has popped because even more smaller companies and individuals are getting in on the action?
That's kind of the definition of a bubble, actually: more and more people start trying to make money on a trend that doesn't have that much real value in it. The dot-com bubble played out much the same way. It wasn't that the web/tech wasn't valuable (it's now the most valuable sector of the world economy), but while the bubble was expanding, more was being invested than it was worth, because no one wanted to miss out and it was accessible enough that almost anyone could try it.
I literally said exactly what you’re explaining. I’m not sure what you’re trying to accomplish here…
What they're saying is that you said the bubble has kinda already popped because (insert description of the middle of the dot-com bubble here, when smaller companies began to join in). Based on that, the bubble hasn't popped at all; small companies are just able to buy in as well before the collapse hits.
It’s like the Internet in 1998
Pets.com hasn't gone bust yet, but it will.
The bubble will burst. BUT… the entire world will run on this new technology, nobody will imagine living without it, and multibillion-dollar companies will profit from it and be created by it.
depends on what and with whom. based on my current jobs with smaller companies and startups? soon. they can't afford the tech debt they've brought onto themselves. big companies? who knows.
Time to face the facts: this utter shit is here to stay, just like every other bit of enshittification we get exposed to.