• gaoshan an hour ago

    One thing it's doing is jacking up electricity rates for US states that are part of the [PJM Interconnection grid](https://en.wikipedia.org/wiki/PJM_Interconnection). The capacity auction price, which is used to guarantee standby availability, is [up significantly](https://www.toledochamber.com/blog/watts-up-why-ohios-electr...) at $270.43 per MW/day, far above prior years (~$29–58/MW/day), and this is translating into significantly higher consumer prices.
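
    A rough way to see the size of that jump is to annualize the quoted clearing prices (a minimal sketch; it ignores how PJM actually allocates capacity costs into retail rates):

    ```python
    # Annualized capacity cost per MW at the quoted clearing prices
    prior_low, prior_high = 29.0, 58.0   # $/MW-day in prior years (figures from above)
    latest = 270.43                      # $/MW-day in the latest auction (figure from above)

    for label, price in [("prior low", prior_low), ("prior high", prior_high), ("latest", latest)]:
        print(f"{label}: ${price * 365:,.0f} per MW per year")
    # prior low:  ~$10,585/MW-year
    # prior high: ~$21,170/MW-year
    # latest:     ~$98,707/MW-year (roughly 4-9x prior years)
    ```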

    • givemeethekeys an hour ago

      Why are consumers paying for electricity used by server farms? Why can't the electricity companies charge the server farms instead?

      Where I live, the utility company bills you at a higher rate if you use more electricity.

      • barbazoo 41 minutes ago

        Charge them more than individual consumers? Why? Let the market decide how much electricity should cost. /s

      • StrangeDoctor 35 minutes ago

        I think the unit you and the article want is MW-day of unforced capacity (UCAP), not MW/day.

        PJM claims this will be a 1.5-5% yoy increase for retail power. https://www.pjm.com/-/media/DotCom/about-pjm/newsroom/2025-r...

        • Mistletoe 27 minutes ago

          You are paying for AI whether you want it or not. Just use it at least, I guess. You have no say over anything else.

        • simonw 2 hours ago

          If I'm interpreting this right, it's estimating that ChatGPT's daily energy usage is enough to charge just 14,000 electric vehicles - and that's to serve on the order of ~100 million daily users.
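
          The arithmetic roughly checks out under some assumptions (the 0.34 Wh per-query figure is the one the article uses; the queries-per-user and battery-size numbers below are my guesses, not from the article):

          ```python
          # Back-of-envelope: does "daily usage ~ charging 14,000 EVs" hang together?
          wh_per_query = 0.34        # Wh per query (the article's figure)
          daily_users = 100e6        # ~100 million daily users
          queries_per_user = 25      # assumption: average queries per user per day
          ev_battery_kwh = 60        # assumption: typical EV battery capacity

          daily_kwh = wh_per_query * daily_users * queries_per_user / 1000
          print(f"{daily_kwh:,.0f} kWh/day ~= {daily_kwh / ev_battery_kwh:,.0f} full EV charges")
          # ~850,000 kWh/day ~= ~14,000 EV charges
          ```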

          • blibble 2 hours ago

            > We used the figure of 0.34 watt-hours that OpenAI’s Sam Altman stated in a blog post without supporting evidence.

            what do you think the odds of this being accurate are?

            zero?

            • JimDabell an hour ago

              Why would you assume that? It’s in line with estimates that were around before he posted that article and it’s higher than Gemini. It’s a pretty unsurprising number.

              • simonw 2 hours ago

                Hard to say. Sam wrote that on June 10th this year: https://blog.samaltman.com/the-gentle-singularity

                GPT-5 came out on 7th August.

                Assuming the 0.34 value was accurate in the GPT-4o era, is the number today still in the same ballpark or is it wildly different?

                • blibble 2 hours ago

                  the "AI" industry have identified that energy usage is going to be used as a stick to beat them with

                  if I was altman then I'd release a few small numbers to try and get influencers talking about "how little energy chatgpt uses"

                  and he can never be accused of lying, as without any methodology as to how it was calculated it's unverifiable and completely meaningless

                  win-win!

                  • a_wild_dandan an hour ago

                    I would bet that it's far lower now. Inference is expensive, but we've made extraordinary efficiency gains through techniques like distillation. That said, GPT-5 is a reasoning model, and those are notorious for high token burn. So who knows, it could be a wash. But the selective pressure to optimize for scale/growth/revenue/independence from MSFT/etc makes me think that OpenAI is chasing those watt-hours pretty doggedly. So 0.34 is probably high...

                    ...but then Sora came out.
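
                    A toy model of that tradeoff (every number below is an illustrative assumption, not a measurement): efficiency gains cut the energy per token, while reasoning models multiply the tokens per query, so where the per-query figure lands depends on which multiplier wins.

                    ```python
                    # Toy model: Wh per query = output tokens x Wh per token (all numbers illustrative)
                    base_tokens = 500
                    base_wh_per_token = 0.34 / base_tokens   # calibrated so the base case is 0.34 Wh

                    efficiency_gain = 3    # assumption: distillation etc. cut energy per token 3x
                    token_multiplier = 5   # assumption: a reasoning model emits 5x the tokens

                    base = base_tokens * base_wh_per_token
                    distilled = base_tokens * base_wh_per_token / efficiency_gain
                    reasoning = base_tokens * token_multiplier * base_wh_per_token / efficiency_gain

                    print(f"base {base:.2f} Wh, distilled {distilled:.2f} Wh, reasoning {reasoning:.2f} Wh")
                    # base 0.34 Wh, distilled ~0.11 Wh, reasoning ~0.57 Wh
                    ```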

                    • yen223 29 minutes ago

                      Yeah, a couple of things we can be fairly confident about:

                      a) training is where the bulk of an AI system's energy usage goes (based on a report released by Mistral)

                      b) video generation is very likely a few orders of magnitude more expensive than text generation.

                      That said, I still believe that data centres in general - including AI ones - don't consume a significant amount of energy compared with everything else we do, especially heating and cooling and transport.

                      Pre-LLM data centres consume about 1% of the world's electricity. AI data centres may bump that up to 2%.
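
                      For scale (the ~30,000 TWh/year figure for global electricity generation is my rough assumption, not from the comment):

                      ```python
                      # What 1% vs 2% of world electricity means in absolute terms
                      world_twh_per_year = 30_000   # assumption: rough global electricity generation

                      for share in (0.01, 0.02):
                          print(f"{share:.0%}: ~{world_twh_per_year * share:,.0f} TWh/year")
                      # 1%: ~300 TWh/year, 2%: ~600 TWh/year
                      ```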

                • moralestapia 2 hours ago

                  I was about to post this exact thing.

                  Seems ... low? And it will only get more efficient going forward.

                  I don't get why this is supposed to be a big deal for infrastructure since there's definitely way more than 14,000 EVs out there and we are doing well.

                  • ares623 2 hours ago

                    the infrastructure needs to go somewhere. And that somewhere needs to have access to abundant water and electricity. It just so happens those are things humans need too.

                    Before GenAI we were on our way to optimizing this, at least to the level where the general public could turn a blind eye. It got to the point where the companies would brag about how efficient they are. Now all that progress is gone, and we're accelerating backwards. Maybe that was all a lie too. But then who's to say the current numbers aren't also a lie, just to make the pill easier to swallow?

                    • moralestapia 2 hours ago

                      Hmm, I guess we'll have to do it the slow way ...

                      What % of EVs on the market is 14,000?
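
                      Roughly, taking the global EV fleet at ~40 million vehicles (my assumption, in the ballpark of the IEA's 2023 estimate; not from the thread):

                      ```python
                      # 14,000 daily EV-charge equivalents as a share of the global EV fleet
                      ev_fleet = 40_000_000             # assumption: EVs on the road worldwide
                      daily_charge_equivalents = 14_000
                      print(f"{daily_charge_equivalents / ev_fleet:.3%} of the fleet per day")  # ~0.035%
                      ```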

                    • azakai an hour ago

                      It is very low in terms of energy per user, yes. As the first figure in the article says, it's the same energy as running an LED lightbulb for an hour - almost negligible.

                      All the later figures in the article talk about the total power for all users, and how that usage will end up concentrated in a small number of data centers. I guess that's their point?

                      But I'm still not sure how that ends up "reshaping energy and infrastructure", the subtitle of the article. If the concentration is a problem, the solution would be to build many small datacenters instead of just a few "Stargate-class" ones?

                  • geuis an hour ago

                    My thoughts.

                    Current-gen AI is going to result in the excess-datacenter equivalent of dark fiber from the 2000s: lots of early buildout and heavy investment, followed by a lack of customer demand and, later, cheaper access to physical compute.

                    The current neural network software architecture is pretty limited. Hundreds of billions of dollars of investor money has gone into scaling backprop networks and we've quickly hit the limits. There will be some advancements, but it's clear we're already at the flat part of the current s-curve.

                    There's probably some interesting new architectures already in the works either from postdocs or in tiny startups that will become the base of the next curve in the next 18 months. If so, one or more may be able to take advantage of the current overbuild in data centers.

                    However, compute has an expiration date, like old milk. It won't physically expire, but its economic potential decreases as the technology advances. But if the timing is right, there is going to be a huge opportunity for the next early adopters.

                    So what's next?

                    • ch4s3 41 minutes ago

                      If the end result here is way overbuilt energy infrastructure, that would actually be great. There’s a lot you can do with cheap electrons.

                      • riku_iki 37 minutes ago

                        I suspect it will mostly be fossil power capacity, which is much easier to scale up.

                    • driverdan an hour ago

                      This is one possibility I'm expecting as well. It largely depends on how long this bubble lasts. At the current growth rate it will become unsustainable before many very large DCs can be built, so it's possible the impact may not be as severe as the telecom crash.

                      Another possibility is that new breakthroughs significantly reduce computational needs, efficiency improves dramatically, or some similar development cuts DC demand.

                      • refulgentis an hour ago

                        It's a line (remindme! 5 years)

                      • driverdan an hour ago

                        This doesn't seem to factor in the energy cost of training, which is currently a very significant overhead.

                        • blondie9x 2 hours ago

                          This doesn't include the energy for mining and chip production either. Can you imagine if it did?

                          Then when you take into account the amount of water used to cool the data centers, as well as the water used as part of the extraction and production process? Things get insane: https://www.forbes.com/sites/cindygordon/2024/02/25/ai-is-ac...