this post was submitted on 21 Aug 2025
313 points (97.9% liked)

Technology

[–] tigeruppercut@lemmy.zip 17 points 3 days ago (3 children)

The real question is why anyone would want to use more power than a regular search engine to get answers that might confidently lie to you.

[–] boor@lemmy.world 2 points 21 hours ago* (last edited 21 hours ago)

Google processes over 5 trillion search queries per year. Attaching an AI inference call to most if not all of those will increase electricity consumption by at least an order of magnitude.

Edit: using their own 0.24 Wh number, that equates to 1.2 billion kWh per year, or about the annual electricity consumption of 114,000 average US homes.
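A quick sanity check of that arithmetic (the 5 trillion queries and 0.24 Wh figures are from the comment above; the ~10,500 kWh/yr household average is the figure cited elsewhere in the thread):

```python
QUERIES_PER_YEAR = 5e12      # Google searches per year, per the comment
WH_PER_PROMPT = 0.24         # Google's median Gemini prompt figure
HOME_KWH_PER_YEAR = 10_500   # rough average US household consumption

total_kwh = QUERIES_PER_YEAR * WH_PER_PROMPT / 1000
homes_equivalent = total_kwh / HOME_KWH_PER_YEAR
print(f"{total_kwh:.1e} kWh/yr, roughly {homes_equivalent:,.0f} homes")
```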

[–] mrgoosmoos@lemmy.ca 8 points 3 days ago

If it's Google they'd be using as the search engine, I get it: search results are turning to shit. It often doesn't show you the relevant stuff, the AI overview is wrong, and ads sometimes take up the entire first page of results. So I see why someone would just want to shout a question into the void and get a quick response instead of having to sort through five crappy results, after filtering those down from 15 possibly relevant ones.

[–] TankovayaDiviziya@lemmy.world 2 points 3 days ago* (last edited 3 days ago)

I use DuckDuckGo. I use its AI features mainly for stock projections and to search for information on company earnings releases, because when I search for earnings schedules myself I get conflicting information. DDG's AI is actually pretty useful for reading through troves of webpages and finding the relevant information for me in that regard.

[–] Armok_the_bunny@lemmy.world 160 points 4 days ago (1 children)

Cool, now how much power was consumed before even a single prompt was run, in training that model? How much power is consumed on an ongoing basis adding new data to those models, even without user prompts? And how much power did each query consume before AI was shoved down our throats, and how many prompts does an average user make per day?

[–] Grimy@lemmy.world 34 points 4 days ago* (last edited 4 days ago) (6 children)

I did some quick math with Meta's Llama model, and the training cost was about a flight to Europe's worth of energy. That's not a lot when you consider how many people use it compared to the flight.

Whatever you're imagining as the impact, it's probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.

[–] boor@lemmy.world 2 points 1 day ago* (last edited 20 hours ago) (1 children)

Please show your math.

One Nvidia H100 DGX AI server consumes 10.2kW at 100% utilization, meaning that one hour’s use of one server is equivalent to the electricity consumption of the average USA home in one year. This is just a single 8-GPU server; it excludes the electricity required by the networking and storage hardware elsewhere in the data center, let alone the electricity required to run the facility’s climate control.

xAI alone has deployed hundreds of thousands of H100 or newer GPUs. Let’s SWAG 160K GPUs = ~20K DGX servers = >200MW for compute alone.

H100 is old. State of the art GB200 NVL72 is 120kW per rack.

Musk is targeting not 160K, but literally one million GPUs deployed by the end of this year. He has built multiple new natural gas power plants which he is now operating without any environmental permits or controls, to the detriment of the locals in Memphis.

This is just one company training one typical frontier model. There are many competitors operating at similar scale and sadly the vast majority of their new capacity is running on hydrocarbons because that’s what they can deploy at the scale they need today.
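The server math above, as a sketch (all numbers are the commenter's SWAG estimates, not measurements):

```python
GPUS = 160_000        # assumed xAI H100 deployment
GPUS_PER_SERVER = 8   # one DGX H100 chassis
KW_PER_SERVER = 10.2  # DGX H100 at full utilization

servers = GPUS // GPUS_PER_SERVER
compute_mw = servers * KW_PER_SERVER / 1000
print(servers, "servers,", compute_mw, "MW before networking, storage, cooling")
```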

[–] Grimy@lemmy.world 1 points 37 minutes ago

I should have specified it was an earlier Llama model; they have since scaled up to more than a flight or two. You are mostly right, except for how much a house uses: it's about 10,500 kWh per year, so you're off by a factor of a thousand. One server-hour is about 8 hours of house time. That is still a lot, though, especially when you consider Musk's 1 million GPUs.

https://kaspergroesludvigsen.medium.com/facebook-disclose-the-carbon-footprint-of-their-new-llama-models-9629a3c5c28b

Their first model took about 2,600,000 kWh to train; a long-haul flight takes about 500,000 kWh, so the actual napkin math is roughly 5 flights. I did the math about 2 years ago, but yeah, I was mistaken and should have at least specified it was for their first model. I think their more recent ones have been a lot more energy intensive.
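That napkin math, spelled out (both figures are the rough estimates from the linked post and the comment):

```python
TRAINING_KWH = 2_600_000  # reported energy to train the first Llama model
FLIGHT_KWH = 500_000      # rough energy for one long-haul flight

flights = TRAINING_KWH / FLIGHT_KWH
print(flights)  # 5.2
```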

[–] Damage@feddit.it 71 points 4 days ago (6 children)

If their energy consumption actually was so small, why are they seeking to use nuclear reactors to power data centres now?

[–] null@lemmy.nullspace.lol 20 points 4 days ago* (last edited 4 days ago) (4 children)

Because demand for data centers is rising, with AI as just one of many reasons.

But that's not as flashy as telling people it takes the energy of a small country to make a picture of a cat.

Also interesting that we're ignoring something here -- big tech is chasing cheap sources of clean energy. Don't we want cheap, clean energy?

[–] boor@lemmy.world 1 points 1 day ago

AI is the driver of the parabolic spike in global data center buildouts. No other use case comes close in terms of driving new YoY growth in tech infra capex spend.

Sir we do not make reasonable points in here, you’re supposed to hate AI irrationally and shut up.

[–] Dojan@pawb.social 12 points 4 days ago (1 children)

Sure we do. Do we want the big tech corporations to hold the reins of that though?

[–] Valmond@lemmy.world 1 points 3 days ago (1 children)

If cheap(er/better) energy is invented then that's good, why would tech corpos be able to "hold the reins" of it exclusively?

[–] Dojan@pawb.social 2 points 2 days ago (1 children)

Well, patents and what have you are a thing. I’m mostly thinking that I wouldn’t want e.g. Facebook to run any nuclear reactors or energy grids. That’s something I prefer the government does.

[–] Valmond@lemmy.world 2 points 2 days ago (1 children)

Nuclear reactors already exist, that's not new tech.

[–] Dojan@pawb.social 1 points 15 hours ago (1 children)

I’m not saying it is. I’m saying that predatory companies shouldn’t run critical infrastructure.

[–] Valmond@lemmy.world 1 points 6 hours ago

We were talking about inventing, not running though.

[–] anomnom@sh.itjust.works 15 points 4 days ago (1 children)

Didn’t xitter just install a gas powered data center that’s breaking EPA rules for emissions?

[–] TomArrr@lemmy.world 8 points 4 days ago

Yes, yes it did. And as far as I can tell, it's still belching it out, just so magats can keep getting owned by it. What a world

https://tennesseelookout.com/2025/07/07/a-billionaire-an-ai-supercomputer-toxic-emissions-and-a-memphis-community-that-did-nothing-wrong/

[–] Imacat@lemmy.dbzer0.com 11 points 4 days ago

To be fair, nuclear power is cool as fuck and would reduce the carbon footprint of all sorts of bullshit.

[–] finitebanjo@piefed.world 7 points 4 days ago* (last edited 4 days ago) (1 children)

Because training has diminishing returns: the small improvement between, say, GPT-3 and GPT-4 needs exponentially more power to reproduce in GPT-5. In 2022 and 2023, OpenAI and DeepMind both predicted that reaching human accuracy could never be done, the latter concluding it was impossible even with infinite power.

So in order to get as close as possible then in the future they will need to get as much power as possible. Academic papers outline it as the one true bottleneck.

[–] Valmond@lemmy.world 1 points 3 days ago (1 children)

And academia will work on that problem. It reminds me of Intel processors that were "projected" to use kilowatts of energy; then smart people made other types of chips, and they don't need 2000 watts.

[–] finitebanjo@piefed.world 1 points 2 days ago* (last edited 2 days ago) (1 children)

Academia literally got cut by more than a third and Microsoft is planning to revive breeder reactors.

You might think academia will work on the problem but the people running these things absolutely do not.

[–] Valmond@lemmy.world -1 points 2 days ago (1 children)
[–] finitebanjo@piefed.world -2 points 1 day ago

Did the EU suddenly develop a tech industry overnight or are you unaware where all the major AI companies are located?

[–] Armok_the_bunny@lemmy.world 7 points 4 days ago (1 children)

Volume of requests, plus power consumption unrelated to the requests themselves, at least I have to assume. It certainly doesn't help that Google has forced me to make a request to their AI every time I run a standard search.

[–] Rentlar@lemmy.ca 20 points 4 days ago (1 children)

Seriously. I'd be somewhat less concerned about the impact if it were only used voluntarily. Instead, AI is compulsively shoved into every nook and cranny of digital products simply to justify its own existence.

The power requirement for training is ongoing, too: mere days after Sam Altman released a very underwhelming GPT-5, he began hyping up the next one.

[–] zlatko@programming.dev 5 points 4 days ago

I also never saw a calculation that took my VPS costs into account. The fckers scrape half the internet, warming up every server in the world connected to the internet. How much energy is that?

[–] fmstrat@lemmy.nowsci.com 32 points 4 days ago

I'd like to understand what this math was before accepting this as fact.

[–] taiyang@lemmy.world 9 points 4 days ago

I usually liken it to video games, ya. Is it worse than nothing? Sure, but that flight or road trip, etc., is a bigger concern. Not to mention that even before AI we had industrial energy and water usage that isn't sustainable... almonds in CA alone are a bigger problem than AI, for instance.

Not that I'm pro-AI cause it's a huge headache from so many other perspectives, but the environmental argument isn't enough. Corpo greed is probably the biggest argument against it, imo.

[–] sbv@sh.itjust.works 52 points 4 days ago (5 children)

In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
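The microwave equivalence checks out, assuming a typical ~1,000 W oven (the wattage is an assumption, not a figure from the article):

```python
PROMPT_WH = 0.24      # Google's median Gemini prompt
MICROWAVE_W = 1_000   # assumed typical microwave draw

seconds = PROMPT_WH / MICROWAVE_W * 3600
print(round(seconds, 2))  # 0.86
```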

[–] unmagical@lemmy.ml 39 points 4 days ago (10 children)

There are zero downsides when mentally associating an energy hog with "1 second of use time of the device that is routinely used for minutes at a time."

https://xkcd.com/1035/

[–] DarkCloud@lemmy.world 12 points 4 days ago* (last edited 4 days ago) (2 children)

The article also mentions that each query evaporates 0.26 milliliters of water, or "about five drops".

[–] plyth@feddit.org 1 points 3 days ago

The human body runs on about 100 watts. At 0.24 Wh per prompt, the equivalent would be about 420 questions per hour, roughly 7 per minute, or one every 8 to 9 seconds. That's close to human capacity.
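The comparison above, worked through (the 100 W figure is the commenter's assumption for human metabolic power):

```python
HUMAN_W = 100     # assumed human metabolic power
PROMPT_WH = 0.24  # Google's median Gemini prompt

prompts_per_hour = HUMAN_W / PROMPT_WH  # prompts matching one hour at 100 W
seconds_per_prompt = 3600 / prompts_per_hour
print(round(prompts_per_hour), round(seconds_per_prompt, 1))
```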

[–] rowrowrowyourboat@sh.itjust.works 42 points 4 days ago (1 children)

This feels like PR bullshit to make people feel like AI isn't all that bad. Assuming what they're releasing is even true. Not like cigarette, oil, or sugar companies ever lied or anything and put out false studies and misleading data.

However, there are still details that the company isn’t sharing in this report. One major question mark is the total number of queries that Gemini gets each day, which would allow estimates of the AI tool’s total energy demand.

Why wouldn't they release this? Even if each query uses minimal energy, countless queries a day would add up to a huge amount of energy.

Which is probably what's happening and why they're not releasing that number.

[–] the_q@lemmy.zip 17 points 4 days ago

That's because it is. This is to help fence-sitters feel better about using a product that factually consumes insane amounts of resources.

[–] frezik@lemmy.blahaj.zone 20 points 4 days ago (1 children)

The company has signed agreements to buy over 22 gigawatts of power from sources including solar, wind, geothermal, and advanced nuclear projects since 2010.

None of those advanced nuclear projects are yet actually delivering power, AFAIK. They're mostly in planning stages.

The above isn't all to run AI, of course. Nobody was thinking about datacenters just for AI training in 2010. But to be clear, there are 94 nuclear power plants in the US, and a rule of thumb is that they produce 1GW each. So Google is taking up the equivalent of roughly one quarter of the entire US nuclear power industry, but doing it with solar/wind/geothermal that could be used to drop our fossil fuel dependence elsewhere.

How much of that is used to run AI isn't clear here, but we know it has to be a lot.
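The fraction in question, under the comment's own rule of thumb (1 GW per plant is an approximation; actual plant capacities vary):

```python
GOOGLE_CLEAN_GW = 22   # contracted clean-power purchases since 2010
US_NUCLEAR_PLANTS = 94
GW_PER_PLANT = 1       # rule of thumb from the comment

fraction = GOOGLE_CLEAN_GW / (US_NUCLEAR_PLANTS * GW_PER_PLANT)
print(f"{fraction:.0%}")  # 23%
```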

[–] wewbull@feddit.uk 4 points 4 days ago

None of those advanced nuclear projects are yet actually delivering power, AFAIK.

...and they won't be for at least 5-10 years. In the meantime they'll just use public infrastructure and then when their generation plans fall through they'll just keep doing that.

[–] NotMyOldRedditName@lemmy.world 16 points 4 days ago (2 children)

There were people estimating 40w in earlier threads on lemmy which was ridiculous.

This seems more realistic.

[–] Valmond@lemmy.world 3 points 3 days ago

40 watt hours.

[–] L0rdMathias@sh.itjust.works 16 points 4 days ago (1 children)

median prompt size

Someone didn't pass statistics, but did pass their marketing data presentation classes.

Wake me up when they release useful data.

[–] jim3692@discuss.online 15 points 4 days ago (3 children)

It is indeed very suspicious that they talk about "median" and not "average".

For those who don't understand what the difference is, think of the following numbers:

1, 2, 3, 34, 40

The median is 3, because it's in the middle.

The average is 16 (1+2+3+34+40=80, 80/5=16).
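In Python's standard library, for anyone who wants to check:

```python
from statistics import mean, median

samples = [1, 2, 3, 34, 40]
print(median(samples))  # 3
print(mean(samples))    # 16
```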

[–] StrangeMed@lemmy.world 13 points 4 days ago

Nice share! Mistral also shared data about one of its largest models (not the one that answers in Le Chat, since that one is Medium, a smaller model, which I guess has smaller energy requirements).

https://mistral.ai/news/our-contribution-to-a-global-environmental-standard-for-ai

[–] Rhaedas@fedia.io 12 points 4 days ago

Now do training centers, since it's obvious they are never going to settle on a final model as they pursue the Grail of AGI. I could do the exact same comparison with my local computer and claim that running a prompt only uses X amount of watts because the GPU heats up for a few seconds and is done. But if I were to do some fine tuning or other training, that fan will stay on for hours. A lot different.
