this post was submitted on 10 Feb 2025
531 points (95.2% liked)

Technology
[–] Grandwolf319@sh.itjust.works 175 points 8 months ago (2 children)

Pretty sure Valve has already realized the correct way to be a tech monopoly is to provide a good user experience.

[–] finitebanjo@lemmy.world 6 points 8 months ago* (last edited 8 months ago) (1 children)

Idk, I kind of disagree with some of their updates at least in the UI department.

They treat customers well, though.

[–] b3an@lemmy.world 10 points 8 months ago* (last edited 8 months ago) (1 children)

Yeah. Steam and I are getting older. Would be nice to adjust simple things like text size in the tool.

Also that ‘Live’ shit bothers me. Live means live. Not ‘was recorded live, and now presented perpetually as LIVE’

[–] Semi_Hemi_Demigod@lemmy.world 86 points 8 months ago* (last edited 8 months ago) (1 children)

Time to dust off this old chestnut

[–] simplejack@lemmy.world 19 points 8 months ago (3 children)

I remember this being some sort of Apple meme at some point. Hence the gum drop iMac.

[–] diemartin@sh.itjust.works 4 points 8 months ago* (last edited 8 months ago) (1 children)

I think the inspiration is from Modern Humorist (note: there's no HTTPS)

Edit: here's the image in question:

[–] tonytins@pawb.social 80 points 8 months ago

Personally, I think Microsoft open sourcing .NET was the first clue open source won.

[–] meowmeowbeanz@sh.itjust.works 76 points 8 months ago (1 children)

Wall Street’s panic over DeepSeek is peak clown logic—like watching a room full of goldfish debate quantum physics. Closed ecosystems crumble because they’re built on the delusion that scarcity breeds value, while open source turns scarcity into oxygen. Every dollar spent hoarding GPUs for proprietary models is a dollar wasted on reinventing wheels that the community already gave away for free.

The Docker parallel is obvious to anyone who remembers when virtualization stopped being a luxury and became a utility. DeepSeek didn’t “disrupt” anything—it just reminded us that innovation isn’t about who owns the biggest sandbox, but who lets kids build castles without charging admission.

Governments and corporations keep playing chess with AI like it’s a Cold War relic, but the board’s already on fire. Open source isn’t a strategy—it’s gravity. You don’t negotiate with gravity. You adapt or splat.

Cheap reasoning models won’t kill demand for compute. They’ll turn AI into plumbing. And when’s the last time you heard someone argue over who owns the best pipe?

[–] Flocklesscrow@lemm.ee 17 points 8 months ago

Governments and corporations still use the same playbooks because they're still oversaturated with Boomers who haven't learned a lick since 1987.

[–] drahardja@lemmy.world 54 points 8 months ago* (last edited 8 months ago) (2 children)

DeepSeek shook the AI world because it’s cheaper, not because it’s open source.

And it’s not really open source either. Sure, the weights are open, but the training materials aren’t. Good luck looking at the weights and figuring things out.

[–] HK65@sopuli.xyz 18 points 8 months ago

I think it's both. OpenAI was valued at a certain point because of a perceived moat of training costs. The cheapness killed the myth, but open sourcing it was the coup de grace as they couldn't use the courts to put the genie back into the bottle.

[–] Hackworth@lemmy.world 14 points 8 months ago (1 children)

True, but they also released a paper that detailed their training methods. Is the paper sufficiently detailed such that others could reproduce those methods? Beats me.

[–] KingRandomGuy@lemmy.world 5 points 8 months ago

I would say that in comparison to the standards used for top ML conferences, the paper is relatively light on the details. But nonetheless some folks have been able to reimplement portions of their techniques.

ML in general has a reproducibility crisis. Lots of papers are extremely hard to reproduce, even if they're open source, since the optimization process is partly random (ordering of batches, augmentations, nondeterminism in GPUs etc.), and unfortunately even with seeding, the randomness is not guaranteed to be consistent across platforms.
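The batch-ordering point can be seen even in a toy setting. A minimal sketch (plain Python, hypothetical helper name) of why seeding makes a run reproducible on one platform while any other source of randomness changes the optimization path:

```python
import random

def batch_order(n_batches: int, seed: int) -> list[int]:
    # A data loader typically shuffles the batch order each epoch
    # using a seeded RNG.
    order = list(range(n_batches))
    random.Random(seed).shuffle(order)
    return order

# Within one process and platform, the same seed reproduces the order...
assert batch_order(16, seed=42) == batch_order(16, seed=42)

# ...but a different seed (or unseeded nondeterminism, e.g. GPU kernels)
# gives a different ordering, and hence a different optimization trajectory.
assert batch_order(16, seed=42) != batch_order(16, seed=7)
```

The catch the comment points at: even with a fixed seed, RNG and floating-point behavior are not guaranteed to match across libraries, hardware, or platforms, so the same "recipe" can still diverge.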

[–] coherent_domain@infosec.pub 30 points 8 months ago (1 children)

I hate to disagree, but IIRC DeepSeek is not an open-source model but open-weight?

[–] canadaduane@lemmy.ca 32 points 8 months ago* (last edited 8 months ago) (2 children)

It's tricky. There is code involved, and the code is open source. There is a neural net involved, and it is released as open weights. The part that is not available is the "input" that went into the training. This seems to be a common way in which models are released as both "open source" and "open weights", but you wouldn't necessarily be able to replicate the outcome with $5M or whatever it takes to train the foundation model, since you'd have to guess about what they used as their input training corpus.
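As a rough checklist, the distinction can be summarized like this (an illustrative sketch with hypothetical field names, not an official artifact list):

```python
# Illustrative breakdown of a typical "open weights" release.
release = {
    "inference_code": True,    # runnable code, open-source licensed
    "weights": True,           # downloadable checkpoint
    "training_data": False,    # the input corpus is not published
    "training_recipe": False,  # only partially described in the paper
}

def fully_reproducible(release: dict) -> bool:
    # You can only rebuild the model from scratch if every
    # ingredient of training is published.
    return all(release.values())

# Open weight, but not reproducible from source:
assert release["weights"] and not fully_reproducible(release)
```

In other words: you can run and fine-tune what was released, but without the training corpus you cannot replicate the foundation model, however much compute you spend.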

[–] vrighter@discuss.tchncs.de 8 points 8 months ago (1 children)

The way I see it, the training data is the model's source code. The code supplied is a bespoke compiler for it, which emits a binary blob (the weights). A compiler is written in code too, just like any other program. So what they released is the equivalent of the compiler's source code, plus the binary blob it output when fed the training data (the source code), which they did NOT release.
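The analogy can be made concrete with a toy "trainer" (a hypothetical one-parameter model, nothing like real training):

```python
# Toy version of the analogy: the training code is the "compiler",
# the training data is the "source code", the weights are the "binary".
def train(training_data: list[float]) -> dict[str, float]:
    # Released: this "compiler" (the training/inference code).
    w = sum(training_data) / len(training_data)
    # Released: the resulting "binary blob" (the weights).
    return {"w": w}

# NOT released: the "source code" (the training corpus).
secret_corpus = [1.0, 2.0, 3.0]
weights = train(secret_corpus)

# From the weights alone you cannot recover the corpus:
# infinitely many datasets "compile" to the same weights.
assert train([0.0, 4.0, 2.0]) == weights
```

Which is why inspecting the weights tells you as little about the training data as disassembling a binary tells you about the original source tree.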

[–] anzo@programming.dev 17 points 8 months ago

Not exactly sure what "dominating" a market means, but the title makes a good point: innovation requires much more cooperation than competition. And the 'AI race' between nations is an antiquated framing pushed by the media.

[–] vxx@lemmy.world 17 points 8 months ago* (last edited 8 months ago) (1 children)

Didn't it turn out that they used 10,000 Nvidia 100-series cards, and that the "low cost" and "low-resource success" claims are a lie?

[–] neons@lemmy.dbzer0.com 4 points 8 months ago

Also, they aren't actually open source? Only the weights are open?

[–] AnimalsDream@slrpnk.net 7 points 8 months ago (2 children)

I'm not too informed about DeepSeek. Is it real open-source, or fake open-source?

[–] ifmu@lemmy.world 5 points 8 months ago (2 children)

It's semi-open, not fully open source in the way that's typically thought of.

[–] DasKapitalist@lemmy.ml 4 points 8 months ago (1 children)

DeepSeek is the company; R1 is an MIT-licensed product, and they have the Qwen models under an Apache license.

You can download, modify, run locally. There are many copies online out of Deepseek's control.
