It also pollutes the minds of ignorant people with misinformation. Not that that's anything new. But I do think objective truth is very important in a democratic society. It reminds me of that video that used to go around showing Sinclair Broadcasting anchors on like 20-some different 'local' news broadcasts all repeating the same words verbatim. It ended with 'This is extremely dangerous to our democracy'. With AI being added to all the search engines, it's really easy to look something up and unknowingly get bombarded with false info pulled out of the dregs of the internet. 90% of people don't verify the answer to see if it's based in reality.
I stopped using it, not that I used it that much, about 5 months ago.
The emoji usage and the heading & bold text patterns make me certain the article was written using AI.
Makes me wonder what they are doing to reach these figures.
Because I can run many models at home, and it wouldn't require me to be pouring bottles of water on my PC, nor would it show on my electricity bill.
Well, most of the carbon footprint for models is in training, which you probably don't need to do at home.
That said, even with training they are not nearly our leading cause of pollution.
The article says that training o4 required an amount of energy equivalent to powering San Francisco for three days.
Most of these figures are guesses along a spectrum of "educated" since many models, like ChatGPT, are effectively opaque to everyone and we have no idea what the current iteration's architecture actually looks like. But MIT did do a very solid study not too long ago that looked at the energy cost of various queries across various architectures. Text queries for very large GPT models actually had a higher energy cost than image generation with Stable Diffusion models at a normal number of iterations, which is pretty crazy. Anyhow, you're looking at per-query energy usage somewhere between 15 seconds of microwaving at full power and riding a bike a few blocks. When tallied over the immense number of queries being serviced, it does add up.
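To put that in numbers, here's a rough back-of-envelope sketch. The microwave wattage and the daily query volume are both illustrative assumptions, not figures from the study:

```python
# Back-of-envelope: what "15 seconds of microwaving per query" adds up to.
# Assumed figures (for illustration only): a ~1.1 kW microwave and
# 1 billion queries per day.

MICROWAVE_POWER_KW = 1.1          # assumed typical microwave draw
SECONDS_PER_QUERY = 15            # upper end of the per-query estimate above
QUERIES_PER_DAY = 1_000_000_000   # hypothetical daily query volume

# Energy per query in kilowatt-hours
kwh_per_query = MICROWAVE_POWER_KW * SECONDS_PER_QUERY / 3600

# Total daily energy in megawatt-hours
daily_mwh = kwh_per_query * QUERIES_PER_DAY / 1000

print(f"{kwh_per_query * 1000:.1f} Wh per query")   # ~4.6 Wh
print(f"{daily_mwh:,.0f} MWh per day")              # ~4,583 MWh
```

Tiny per query, but the aggregate is the electricity draw of a small city's worth of homes, which is why the totals make headlines.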
That all said, I think energy consumption is a silly thing to attack AI over. Modernize, modularize, and decentralize the grids and convert to non-GHG sources, and it doesn't matter; there are other concerns with AI that are far more pressing (like deskilling effects and the inability to control mis- and disinformation).
Basically every tech company is using it... It's millions of people, not just us...
Billions. Practically every Google search runs through Gemini now, and Google handles more search queries per day than there are humans on Earth.
Ew, who still uses Google Search?
Bitcoin or crypto?
What does it mean to consume water? Like it's used to cool something and then put back in a river? Or it evaporates? It's not like it can be used in some irrecoverable way right?
If they take the water and don't return it to the source, there will be less water available in the water body, which can lead to scarcity. If they return it, but at a higher temperature or along with pollutants, it can harm the life in the water body. If they treat the water before returning it, so it's as close as possible to its original properties, there will be little impact, but that means using more energy and resources for the treatment.
"using" water tends to mean that it needs to be processed to be usable again. you "use" water by drinking it, or showering, or boiling pasta too.
It's using energy; we need more renewables. That's not a problem with AI. Direct your opprobrium where it belongs.
I have started using Copilot more lately, but I’ve also switched from plastic straws to paper, so I’m good, right?
Why did you start using straws at all?
This is my main issue with it. I think it's useful enough, but only if it uses about the same energy as you would doing the task without it. Most conversations I've had with someone trying to convince me it doesn't use too much power end up feeling a lot like the crypto ones, where it keeps being apples to oranges and the energy consumption still seems too high. I'm hoping hardware can bring the power use down the way graphics cards did. I want to see querying an LLM use about the same energy as searching for the answer, or less.
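For a sense of what that parity gap might look like, here's a minimal sketch using assumed ballpark figures (roughly 0.3 Wh per traditional search and 3 Wh per LLM query; both are illustrative assumptions, not measurements):

```python
# Rough comparison of per-query energy: traditional web search vs. an LLM query.
# Both figures are assumed ballpark values for illustration, not measurements.

SEARCH_WH_PER_QUERY = 0.3   # assumed energy for one traditional search query
LLM_WH_PER_QUERY = 3.0      # assumed energy for one LLM query

ratio = LLM_WH_PER_QUERY / SEARCH_WH_PER_QUERY
print(f"Under these assumptions, an LLM query uses ~{ratio:.0f}x a search query.")

# The efficiency gain needed for an LLM query to reach parity with a search
print(f"Efficiency gain needed for parity: ~{ratio:.0f}x")
```

If those ballparks are anywhere near right, LLM hardware and inference software would need roughly an order-of-magnitude efficiency gain to hit that bar.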
Generating bullshit that isn't really that useful.
Remember when the Apple Newton "revolutionized" computing with handwriting recognition?
No, of course not, because the whole thing sucked and vanished outside of old Doonesbury cartoons. LOL
My peer used the newton for comp sci class notes. Daily. Exclusively.
Then she went on to mastermind the behaviour and tactics of Myth: The Fallen Lords.
It's tenuous, but I say that's causal.