• mattlondon 13 hours ago

    Which enterprises will seriously be rolling their own and not just using cloud?

    All the clouds offer sovereignty controls now IIRC. If your data is so sensitive that it can't even leave the building, then you're probably going to have a bad time using it with your own home-grown AI solutions due to hallucinations.

    • croes 12 hours ago

      Thanks to the CLOUD Act there is no sovereignty.

      Why are homegrown hallucinations worse than cloud hallucinations?

      • mattlondon 9 hours ago

        Less training than someone like Google, which has ~everything and is pouring billions into research.

        I don't think anyone apart from a FAANG-like corp really stands a chance of keeping up with SOTA.

        • formerphotoj 9 hours ago

          And state actors, somewhat.

      • undefined 12 hours ago
        [deleted]
        • crimsoneer 13 hours ago

          Yeah, this whole article makes no sense. Nobody is going to be running their own datacentres; everyone is just going to run it on their existing AWS/GCP/Azure.

          • philipwhiuk 12 hours ago

            Depends how much AI you're doing. At some point cloud no longer makes economic sense. Additionally, finance needs to co-locate compute for latency reasons, meaning there's a limit to what you can offload to the cloud.
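
            A rough back-of-the-envelope of where that break-even might sit; every number below is an illustrative assumption, not a vendor quote:

              # Rented cloud GPUs vs. buying an 8-GPU box (illustrative numbers only).
              cloud_rate_per_gpu_hour = 4.00   # assumed on-demand $/GPU-hour
              gpus = 8
              utilization = 0.7                # fraction of hours the GPUs stay busy

              capex = 250_000                  # assumed price of the 8-GPU server
              power_kw = 6.0                   # assumed draw under load
              power_cost_per_kwh = 0.12        # assumed electricity price

              hours = 24 * 365
              cloud_per_year = cloud_rate_per_gpu_hour * gpus * hours * utilization
              onprem_power_per_year = power_kw * hours * power_cost_per_kwh  # ignores staff, cooling, networking

              print(f"cloud:   ${cloud_per_year:,.0f}/yr")
              print(f"on-prem: ${capex:,.0f} up front + ${onprem_power_per_year:,.0f}/yr power")
              print(f"break-even after ~{capex / (cloud_per_year - onprem_power_per_year):.1f} years of sustained use")

            Under sustained utilization the break-even lands within a couple of years; with bursty workloads the cloud keeps the edge. The co-location/latency constraint is a separate issue that pricing doesn't fix.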

            • add-sub-mul-div 12 hours ago

              And the respective cloud providers will absorb the increased cost of this power usage forever?

          • sebazzz 13 hours ago

            It's pretty ironic though. The last few years were focused on reducing CO2 output by reducing power usage, etc. Now, with all the AI hype, we suddenly don't hear so much about that.

            • kjkjadksj 12 hours ago

              They still talk efficiency out of one side of their mouth. Meanwhile, people are buying MacBooks with what, 24 cores, to check email on a website for work, all because that MacBook now trains models in the background for reasons that clearly help this average user and their average use case.

              I will say, as an owner of one of these MacBooks, that I appreciate finally having a computer with a full workday of battery life. But still: 24 cores to run the same sort of workflow that was established 25 years ago. 2025 Word still does exactly what Word 98 did. Same email. Same Excel. Same editors and IDEs. But of course an order of magnitude more resources squandered to do so.

              And we tell ourselves that we need such powerful laptops because probably 1% of users actually insist on pushing a laptop to its limits instead of just buying an unsexy desktop or renting a server. When really, for most people's use cases, we could all have gotten by with the dual-core MacBooks. I had one of those myself; it lasted 12 years before an unfortunate water spill.

              • bunderbunder 12 hours ago

                That 24 core MacBook probably also consumes at least an order of magnitude less power while checking email than the 1-core computer you had 25 years ago.

                Heck, that MacBook probably consumes less power with all 24 cores blazing than the 1-core computer did just to check email. Modern notebook CPUs do a really good job of leaving cores that aren't currently needed in power-save mode. That's what makes it possible for the battery to last a full workday, despite possibly having less capacity than typical laptop batteries from 25 years ago.
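
                Rough numbers for the scale of that gap (all wattages are assumptions for illustration, not measurements):

                  # Order-of-magnitude check: the "check email" workload, then vs. now.
                  old_tower_w = 75    # assumed late-90s single-core desktop under light load
                  old_crt_w = 70      # assumed 17" CRT monitor
                  new_laptop_w = 6    # assumed modern laptop, most cores parked, screen on

                  print(f"~{(old_tower_w + old_crt_w) / new_laptop_w:.0f}x less power today")
                  # Even pinned at full load (assume ~40 W package power), the laptop
                  # still draws well under the old machine's idle consumption.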

                • pjturpeau 12 hours ago

                  And I still own my Core 2 Quad Q9650, which will soon move from Windows 10 to some Linux distro to keep supporting side activities: the children's internet access for homework, or duty as an alternative testbed machine.

                  • undefined 8 hours ago
                    [deleted]
                • oytis 12 hours ago

                  But once we have AGI it will solve climate change forever. Probably.

                  • smallmancontrov 12 hours ago

                    By killing all the humans? Or by setting up nice enclaves for the owners and keeping the rest of us in squalor with a swarm of murderbots?

                    • PittleyDunkin 12 hours ago

                      ...if AGI is even a meaningful concept. I'm deeply skeptical.

                      • exe34 12 hours ago

                        do you not believe human level intelligence is possible?

                        • marcosdumay 11 hours ago

                          Human level intelligence is clearly not enough to solve global warming all by itself just by existing.

                          What people mean when they say "AGI" is something different.

                          • exe34 10 hours ago

                            > What people mean when they say "AGI" is something different.

                            what do they mean?

                      • anticorporate 12 hours ago

                        I hope this is /s

                    • FloorEgg 12 hours ago

                      Solving CO2 emissions by reducing power usage never made any sense. Anyone with a zoomed-out view of human history and technology trajectories would have had the intuition that our energy needs will continue to grow on an exponential curve indefinitely.

                      And before anyone says it can't grow forever, that it's unsustainable, etc.: you're not thinking big enough. Just google the Kardashev scale.

                      If your point is that "they used to say X, now they say Y", my rebuttal is 1) is it the exact same person/people? and 2) it's common for people to be mistaken, then learn something new and change their opinion. It might not happen in the middle of an internet argument, but it does happen, a lot.

                      • add-sub-mul-div 12 hours ago

                        I just invented a scale that goes three stages further than the Kardashev scale, so we're in an even better position than you think!

                        • kjkjadksj 12 hours ago

                          The thing is, the human experience of tech is not very linear in this regard. Take my life: strip out the smartphone and it is the exact same experience as the 90s in terms of what I experience and my quality of life. Strip out the desktop and it's much the same as the 1950s, perhaps. Car, home, grocery store: basically all the same, and that encompasses all of life right there.

                          So it seems to me, given that much of life outside that distracting cellphone in your pocket is the same in terms of amenities, affordances, and available goods as it was in the 1950s, that there ought to have been some efficiency gains over the 75 years since. Then again, maybe those gains were swallowed by more people globally being lifted out of the desperate, Great Depression sort of life we saw in this country in the 1930s (and see elsewhere today) into the prototypical 1950s lifestyle, where a job begets housing, transport in some form, and the modern experience of grocery stores stocked with all goods, instead of having to homestead or contribute to a village system for limited goods.

                          • ajmurmann 12 hours ago

                            This seems way off. Medical advancements have grown significantly since the 90s. Minimally invasive surgery was just starting. We've made huge progress at eradicating several diseases. HIV treatment and prevention has advanced leaps and bounds. Outside the US traffic fatalities dropped significantly (somehow in the US they went back up). In general the level of assistance and safety features you get in a modern car would have blown everyone's mind in the 90s. Fuel efficiency is way up. Induction stoves are so much better than electric stoves from the 90s without messing up your house's air like a gas stove. Medical imaging has made incredible breakthroughs. Innovations in shipping and logistics have enabled us to get more interesting food items like third wave coffee. We can watch the movies and tv shows we want and when we want rather than VHS recordings or whatever is on tv right now. Due to higher resolution we can actually have sports broadcasts with useful data. Many houses now have solar cells and batteries and pay little or nothing for their electricity. Amenities in houses are much nicer now.

                            I'm sure I could go on and on with this for much longer. It's easy to not see our progress over time even though it's tremendous.

                          • logicchains 12 hours ago

                            >Solving CO2 output by reducing power usage never made any sense.

                            Exactly. Any society that seriously adopted it and deliberately deindustrialised would eventually just get taken over by a more industrialised neighbour.

                        • grajaganDev 14 hours ago

                          There is not enough water or power to support the current AI stock prices.

                          • crimsoneer 13 hours ago

                            It's worth recognising that a year ago GPT-4 was the cutting edge, and today you can run a GPT-4-class model on a MacBook Pro (with lots of RAM, sure).
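
                            The limiting factor is mostly memory; a rough sketch, assuming a 70B-parameter model quantized to 4 bits:

                              # Rough RAM needed to run a quantized 70B-parameter model locally.
                              params = 70e9
                              bytes_per_param = 0.5   # 4-bit quantization
                              weights_gb = params * bytes_per_param / 1e9
                              overhead_gb = 8         # assumed KV cache + runtime overhead
                              print(f"~{weights_gb + overhead_gb:.0f} GB of unified memory")  # ~43 GB

                            That fits in a high-RAM MacBook Pro's unified memory; the same model unquantized in fp16 (~140 GB of weights alone) would not.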

                            • EVa5I7bHFq9mnYK 12 hours ago

                              A year ago GPT-4 was the cutting edge; the new cutting-edge model takes 11.2 MWh of energy to answer a single question (enough to power a home for 4 years).

                              • gopher_space 12 hours ago

                                It's interesting to look at how hardware requirements change alongside acceptable response time. Knowing when you need a reply seems like it might save a lot of money.
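
                                A toy sketch of that trade-off; the throughput and price figures are assumptions, just to show the shape of it:

                                  # Interactive vs. "answer by tomorrow" serving on the same hardware.
                                  gpu_cost_per_hour = 4.00      # assumed
                                  tok_per_s_interactive = 60    # assumed: small batches, low latency
                                  tok_per_s_batched = 600       # assumed: large offline batches

                                  def cost_per_million_tokens(tok_per_s):
                                      hours = 1e6 / tok_per_s / 3600
                                      return hours * gpu_cost_per_hour

                                  print(f"interactive: ${cost_per_million_tokens(tok_per_s_interactive):.2f} per 1M tokens")
                                  print(f"batched:     ${cost_per_million_tokens(tok_per_s_batched):.2f} per 1M tokens")

                                Same hardware, roughly an order of magnitude apart in cost per token, purely because latency tolerance allows batching.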

                                • PittleyDunkin 12 hours ago

                                  Yeah, but GPT-4 is also only marginally useful in terms of productivity as it stands today. I highly doubt we're going to get returns on this without huge structural changes to how chatbots work.

                                  • crimsoneer 12 hours ago

                                    Sure, I think that's fair - the tech is only one part of the whole equation. Just pointing out that power demands are really quite a small (and actually quite surmountable) part of the problem.

                                • agilob 12 hours ago

                                  The executive order talks about building new geothermal power plants to support AI infra

                                  • PittleyDunkin 12 hours ago

                                    How about we cut the AI bit and just get a nice CO2 decrease, i.e. the actual thing that will support future humans? That's one more coal plant we can shut down.

                                    • grajaganDev 5 hours ago

                                      What about the water?

                                  • bokohut 11 hours ago

                                    Can we proactively speculate that, as businesses adopt artificial intelligence at an ever-increasing rate, they will remove human roles in a short-term effort to decrease costs? Over the longer term, however, they will just be locking themselves into a greater energy expense, either indirectly from cloud use or directly when buying from a centralized energy provider. With salaried employees a business can "excuse away" a raise, or fire someone for another person who will take equal pay; but one must pay the cloud/energy bill to operate tomorrow. Are we as a species going to be unintelligent and artificially believe that prices will not continue to rise, most certainly for energy? Interesting immediate times as the technology Kool-A.i.d propagates with little regard for the long-term impacts.

                                    • nbuujocjut 13 hours ago

                                      Will most organisations be building out their own AI Infrastructure? I would guess in most cases they would either use APIs to existing models (OpenAI etc) or use cloud providers where they need to run and train their own models. Is there a risk that cloud providers cannot meet the demand?

                                      • pavel_lishin 13 hours ago

                                        Certain companies have very strict data handling requirements, and can't just send data off-site.

                                        • cptskippy 12 hours ago

                                          Yes, and those companies are a minority. I wouldn't be surprised if AI providers developed solutions for them. Almost immediately we saw tinybox emerge, old Tesla cards doubled and tripled in price on eBay, and Nvidia announced Digits, so there's definitely interest in offline solutions.

                                        • undefined 13 hours ago
                                          [deleted]
                                          • PittleyDunkin 12 hours ago

                                            > Is there a risk that cloud providers cannot meet the demand?

                                            Waste? Cloud providers are super expensive, and the benefits of outsourcing generally can't justify that expense.

                                          • mlepath 13 hours ago

                                              All companies doing something with the current iteration of "AI" are underwater. Sam Altman says that the Pro tier of ChatGPT is losing money, Adobe is losing tons on Firefly, ... This is pretty typical for Silicon Valley though: we always burn investor money to corner the market, and then the tech usually catches up. Most enterprises don't need to be first adopters.

                                            • DeepYogurt 13 hours ago

                                              > This is pretty typical for silicon valley though, we always burn investor money to corner the market and then tech usually catches up.

                                                Yes, though the cost breakdown has traditionally been large upfront development costs and low-to-moderate running costs. This time around the running costs are astronomical, and Moore's law ain't what it used to be.

                                              • enragedcacti 12 hours ago

                                                  It seems like a repeat of Uber's play, but at 10x scale: lose tens of billions of dollars scaling a product that loses money on every sale, in the hope that it positions you well for the massive disrupter of [self-driving | AGI]. Uber's play didn't shake out, so now they are very, very slowly digging out of a 30bn hole. I guess it's just a question of how big the AI hole gets before they either make AGI or give up and start shoveling.

                                                • marcosdumay 11 hours ago

                                                  While there is no reasonable explanation for why Uber can't just turn a profit (and it looks like they have for the last few years), deep-learning models have very hard physical constraints on how much they cost.

                                                  • enragedcacti 7 hours ago

                                                      I would argue that physical cars driven by people have much harder cost constraints than LLMs, which can see huge cost savings (for the same-performance model) as hardware improves. I agree they aren't perfect parallels, but in principle there's nothing stopping AI companies from massively cutting R&D and raising prices until marginal revenue is positive; it would just mean accepting not getting "take over the world" level profitability, or getting run out of town by someone willing to keep burning money.

                                              • dehrmann 13 hours ago

                                                > we always burn investor money to corner the market and then tech usually catches up

                                                This is most R&D. You research, build a prototype, bring it to market, and it only hits profitability at volume.

                                                • skywhopper 12 hours ago

                                                    Sam Altman's admission that ChatGPT Pro loses money was about its operating losses, not even counting the R&D that went into it.

                                                • ForHackernews 13 hours ago

                                                  Ah, the WeWork strategy...

                                                  • __loam 13 hours ago

                                                    "We don't make money off the $200/mo option" is embarrassing.

                                                  • ForHackernews 13 hours ago

                                                    Don't worry, they can just convert all those Bitcoin mining racks to AI. Er, wait... what do you mean it only does SHA256?

                                                    • the_sleaze_ 13 hours ago

                                                      Bitcoin miners are purchased with full expectation of a life-span and an ROI within that, just like all data centers. Not sure why you're bringing this up anyways.

                                                      • add-sub-mul-div 12 hours ago

                                                        Same reason we teach history.

                                                      • EVa5I7bHFq9mnYK 12 hours ago

                                                        Bitcoin miners provide a valuable commodity - secure and unrestricted storage and transfer of value. AI provides ugly cat pictures, bad code and a lot of spam.

                                                        • janderson215 11 hours ago

                                                          Yeah, AI has never produced anything of value and nobody in the crypto world has ever scammed anybody.

                                                          • EVa5I7bHFq9mnYK 11 hours ago

                                                            Yeah, careful around this internetz thing - someone scammed somebody over there, I've heard.

                                                      • codingwagie 13 hours ago

                                                        What is it with society producing content that predicts the downfall of some new trend?

                                                        • AnimalMuppet 13 hours ago

                                                          "If something cannot go on forever, it will stop." - Herb Stein

                                                          Point is, we can look at the situation, and recognize "that can't go on forever". And then we can say so, sometimes in public places. And there's nothing wrong with that.

                                                        • Vox_Leone 12 hours ago

                                                          Decentralizing computation [making it mostly on-premises again] could potentially mitigate some of the problems the article points out, in several ways:

                                                          *Localizing Power Usage*: By moving some computational tasks to local systems (on-premises), you could reduce the reliance on large, centralized data centers, which are often constrained by power and cooling limitations. This would also reduce the need for extensive retrofitting of data centers to support higher power densities.

                                                          *Efficiency Gains*: Decentralized computation, especially for non-essential services, could be more energy-efficient on a per-unit basis. Local infrastructure might allow for better optimization of energy use. Smaller, distributed systems could have specialized power and cooling needs that are more manageable and tailored to the task at hand.

                                                          *Reduced Data Transmission and Latency*: With computing spread out across local or edge-based facilities, energy costs could be further reduced by minimizing the need for long-distance data transmission, which itself requires significant energy, especially with AI workloads.
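
                                                            A rough sketch of the latency side of that trade-off (all figures below are assumptions for illustration):

                                                              # When does shipping the request to a distant DC dominate response time?
                                                              payload_mb = 2.0        # assumed request + context size
                                                              bandwidth_mbps = 100    # assumed uplink
                                                              rtt_ms = 60             # assumed round trip to a remote region
                                                              remote_infer_ms = 300   # assumed inference time on big cloud GPUs
                                                              local_infer_ms = 900    # assumed inference time on modest local hardware

                                                              transfer_ms = payload_mb * 8 / bandwidth_mbps * 1000
                                                              remote_total = rtt_ms + transfer_ms + remote_infer_ms
                                                              print(f"remote: {remote_total:.0f} ms   local: {local_infer_ms:.0f} ms")

                                                            Where the crossover sits depends entirely on those assumptions: large payloads and slow links favour local compute, while fast links and much faster DC hardware favour the cloud.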

                                                          • sobellian 12 hours ago

                                                            I usually have the opposite intuition, that centralizing gives greater economies of scale. If a DC could be made more efficient by splitting it into N micro-DCs, then couldn't AWS just do that? That they do not seems telling. AFAICT centralization offers strictly more freedom in the solution space than if everyone rolls their own micro-DC.