this post was submitted on 22 Jan 2025
111 points (90.5% liked)

Not The Onion


Couldn't make this shit up if I tried.

top 27 comments
[–] brucethemoose@lemmy.world 66 points 4 days ago* (last edited 4 days ago) (1 children)

In case anyone missed it, DeepSeek just released models that make OpenAI's best nearly irrelevant... in the open, for anyone to host. For a tiny fraction of the hosting cost.

Even the small distillation that fits on a 24GB VRAM desktop is incredible. And you can host it for others to use for free, with room for batching, like I'm doing right now. And there is so much that's awesome about it, like the SFT training pipeline/code being published and the smaller models being built on top of models from another company (Qwen 2.5).
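For a rough sense of which distill fits on which card, here's a back-of-the-envelope VRAM estimate (a sketch, not a real sizing tool; the flat 2 GB overhead for KV cache and activations is an assumption, and the parameter counts match the DeepSeek-R1 distill sizes mentioned in this thread):

```python
def vram_gb(params_billion: float, bits_per_weight: int, overhead_gb: float = 2.0) -> float:
    """Rough VRAM needed: quantized weight size plus a flat overhead
    for KV cache and activations (the overhead value is a guess)."""
    return params_billion * bits_per_weight / 8 + overhead_gb

# 32B distill at 4-bit quantization: ~18 GB, fits a 24GB card
print(round(vram_gb(32, 4)))  # 18
# 14B distill at 4-bit: ~9 GB, fits ~10GB cards
print(round(vram_gb(14, 4)))  # 9
```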

I don't even care what he's saying now; just don't believe a word that comes out of Altman's mouth. He's just as much of a greedy con man as Musk, trying to gaslight everyone into thinking OpenAI will be relevant in a year rather than a hollow, closed shell that sold out its research directive for cheap short-term profit.

[–] unmagical@lemmy.ml 16 points 4 days ago (1 children)

24GB VRAM desktop

That's a $1000 GPU minimum if you go red team, or $1500 for green.

[–] brucethemoose@lemmy.world 8 points 4 days ago* (last edited 4 days ago) (1 children)

Dual 3060s are an option. LLMs can be split across GPUs reasonably well.

3090s used to be like $700 used, but ironically they've gone up in price. I got mine for around $800 a while ago and stuffed it into a 10L PC.

Some people buy used P40s. There are rumors of a 24GB Arc B580. Also, AMD Strix Halo APU laptops/mini PCs can host it quite well, with the right software setup... I might buy an ITX board if anyone ever makes one.

Also, there are 12GB/6GB VRAM distillations too, but 24GB is a huge intelligence step-up.
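The dual-GPU split mentioned above is typically a per-layer assignment proportional to each card's VRAM. A minimal sketch of the idea (a hypothetical helper; real backends like llama.cpp expose this as a tensor/layer split ratio rather than this exact function):

```python
def split_layers(n_layers: int, gpu_vram_gb: list[float]) -> list[int]:
    """Assign transformer layers to GPUs proportionally to their VRAM.
    Any leftover layers are handed out one at a time from the first GPU."""
    total = sum(gpu_vram_gb)
    counts = [int(n_layers * g / total) for g in gpu_vram_gb]
    for i in range(n_layers - sum(counts)):
        counts[i % len(counts)] += 1
    return counts

# Two 12GB 3060s hosting a 64-layer model: 32 layers each
print(split_layers(64, [12, 12]))  # [32, 32]
# A 24GB 3090 plus a 12GB 3060: roughly a 2:1 split
print(split_layers(60, [24, 12]))  # [40, 20]
```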

[–] unmagical@lemmy.ml 5 points 4 days ago (1 children)

Totally forgot the 3090 had 24GB. It's definitely still enthusiast territory though.

[–] brucethemoose@lemmy.world 3 points 4 days ago

For sure.

The 14B distillation is still quite good, and usable on like 10GB GPUs. Maybe 8 with the right settings.

[–] Quill7513@slrpnk.net 47 points 4 days ago

WHO CARES!? WHO ACTUALLY GIVES A SHIT!? WHO THE FUCK IS LETTING THIS FUCKING 400 BILLION DOLLAR BURDEN ON THE WORLD DOMINATE NEWS CYCLES FOR ANYTHING OTHER THAN "this guy fucking sucks" WHAT THE FUCK ARE WE ALL DOING!?

[–] OpenStars@piefed.social 38 points 4 days ago (1 children)
[–] L0rdMathias@sh.itjust.works 34 points 4 days ago (1 children)

And not a single mention of our Lord and Savior Dr. Daniel Jackson.

[–] cm0002@lemmy.world 5 points 4 days ago (1 children)

Well he's kinda busy on the higher plane of existence, or is he back again... or did he come back and then go back again‽

Where in the ~~world~~ planes of existence is Dr Daniel Jackson‽‽

[–] 667@lemmy.radio 2 points 4 days ago

Subtly influencing events to improve the outcome of the Daedalus in their bid to leap between galaxies.

[–] NONE_dc@lemmy.world 22 points 4 days ago (1 children)

I thought it was about the TV show...

[–] HikingVet@lemmy.ca 12 points 4 days ago (1 children)
[–] NONE_dc@lemmy.world 13 points 4 days ago (1 children)

Once again, AI deceived and disappointed us...

[–] devfuuu@lemmy.world 2 points 4 days ago (1 children)

Look, at this point, we may actually be able to create our own sequels with AI... as long as the replicators are kept far away from it.

[–] HikingVet@lemmy.ca 2 points 4 days ago

Yeah, how about we don't teach AI about replicators.

[–] Omgboom@lemmy.zip 12 points 4 days ago (4 children)

Stargate Atlantis is the best Stargate series, fight me.

[–] devfuuu@lemmy.world 5 points 4 days ago

It surely had some amazing moments.

[–] llii@discuss.tchncs.de 2 points 4 days ago

Dr. McKay is the best.

[–] treadful@lemmy.zip 4 points 4 days ago (1 children)

The Wraith were cringe though and you know it.

[–] Metz@lemmy.world 4 points 4 days ago

Todd was cool though.

[–] Tar_alcaran@sh.itjust.works 4 points 4 days ago (1 children)

Someone give this poster control of a government agency! They're super extra correct.

[–] OpenStars@piefed.social 3 points 4 days ago

No way, they used facts rather than sucking up, so they're fired already.

[–] blindbunny@lemmy.ml 5 points 4 days ago* (last edited 4 days ago)

Distraction

[–] heavydust@sh.itjust.works 3 points 4 days ago

Idiots' fight, for the next 4 years. I'm tired already.

[–] possiblylinux127@lemmy.zip 1 points 4 days ago (1 children)

Can we get a boxing match? That would be fun to watch.

[–] ProfessorProteus@lemmy.world 1 points 4 days ago

I'd be fine if they brought back old-fashioned duels. Besides, boxing requires a level of athleticism. I don't really know what Altman looks like (and I couldn't give less of a shit), but Musk is a disgusting, pale cold ham. Seen those boat photos? 🤮