this post was submitted on 19 Jan 2024
198 points (95.8% liked)

top 50 comments
[–] Oha@lemmy.ohaa.xyz 112 points 10 months ago (9 children)

what the fuck is an "AI PC"?

[–] Sabata11792@kbin.social 64 points 10 months ago (2 children)

They get to put a sticker on it that inflates the value by $600, then fill it with spyware.

[–] assassinatedbyCIA@lemmy.world 23 points 10 months ago (1 children)

‘CApItALiSm BreEdS INnoVAtION’

[–] Oha@lemmy.ohaa.xyz 12 points 10 months ago (1 children)
[–] Sabata11792@kbin.social 10 points 10 months ago

Found the shareholder.

[–] LodeMike@lemmy.today 55 points 10 months ago (1 children)

It means “VC money now 🥺🥺”

[–] Deceptichum@kbin.social 15 points 10 months ago (1 children)

Microsoft is chasing VC money now?

[–] maynarkh@feddit.nl 16 points 10 months ago

Not VC, more like hedge funds and institutional investors. But yes, all public companies work primarily for a higher share price, and everything else comes second. I've seen a public US company spend more than a million USD to save 300k, just so it could put out good articles about itself saying it kept its promises to shareholders.

[–] LemmyIsFantastic@lemmy.world 23 points 10 months ago

Branding. It's just saying it's capable of handling local models for Copilot.

[–] Potatos_are_not_friends@lemmy.world 15 points 10 months ago

If I have to deal with Blockchain cloud computing IoT bullshit as a software engineer, I want everyone else to feel my buzzword pain in the tech they use.

[–] fidodo@lemmy.world 9 points 10 months ago

I guess a PC with a graphics card?

[–] cholesterol@lemmy.world 9 points 10 months ago

The new 'VR Ready'

[–] maynarkh@feddit.nl 80 points 10 months ago (1 children)

Thus, Windows will again be instrumental in driving growth for the minimum memory capacity acceptable in new PCs.

I love that the primary driver towards more powerful hardware is Windows just bloating itself bigger and bigger. It's a grift in its own way: consumers are subsidizing the requirements for Microsoft's idiotic data processing. And MSFT is not alone in this; Google doing away with cookies also conveniently shifts most ad processing away from their servers into Chrome (while killing their competition).

[–] thesorehead@lemmy.world 9 points 10 months ago (1 children)

Google doing away with cookies also conveniently shifts most ad processing away from their servers into Chrome (while killing their competition).

OOTL, what's going on here? Distributed processing like Folding@Home, but for serving ads to make Google more money?

[–] maynarkh@feddit.nl 16 points 10 months ago

They called it Federated Learning of Cohorts at one point. Instead of you sending raw activity data to Google servers and them running their models there, the model runs in Chrome and only sends back the ad-targeting groups you belong to. All in the name of privacy, of course.
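
Conceptually it's something like this (a rough sketch, definitely not Chrome's actual code, and the hashing scheme below is just a stand-in):

```python
# Rough illustration of the FLoC idea: the browsing history never leaves the
# machine -- only a coarse cohort ID does, which is the "privacy" part.
import hashlib

def cohort_id(visited_domains: list[str], bits: int = 16) -> int:
    """SimHash-style fold of visited domains into one small cohort number."""
    counts = [0] * bits
    for domain in visited_domains:
        digest = int(hashlib.sha256(domain.encode()).hexdigest(), 16)
        for i in range(bits):
            counts[i] += 1 if (digest >> i) & 1 else -1
    return sum(1 << i for i in range(bits) if counts[i] > 0)

history = ["news.example.com", "shop.example.org", "games.example.net"]  # stays local
print("only this gets shared for ad targeting:", cohort_id(history))
```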

[–] thecrotch@sh.itjust.works 44 points 10 months ago* (last edited 10 months ago) (1 children)

Microsoft is desperate to regain the power they had in the 00s and is scrambling to find that killer app. At least this time they're not just copying Apple's homework.

[–] Toribor@corndog.social 7 points 10 months ago* (last edited 10 months ago) (1 children)

They either force it on everyone or bundle it in the enterprise package that businesses already pay for and then raise the price.

It never works, but maybe this time it will. I mean it won't... But maybe.

[–] tias@discuss.tchncs.de 8 points 10 months ago* (last edited 10 months ago)

And maybe that's why it isn't working. They try too hard to persuade or force you, giving people icky feelings from the get go... and they try too little to just make a product that people want.

[–] cmnybo@discuss.tchncs.de 43 points 10 months ago (1 children)

At least it should result in fewer laptops being made with ridiculously small amounts of non-upgradable RAM.

Requiring a large amount of compute power for AI is just stupid though. It will probably come in the form of some sort of dedicated AI accelerator that's not usable for general-purpose computing.

[–] throws_lemy@lemmy.nz 22 points 10 months ago (1 children)

And remember that your data and telemetry are sent to Microsoft servers to train Copilot AI. You may also need to subscribe to get some advanced AI features.

[–] DontMakeMoreBabies@kbin.social 8 points 10 months ago (2 children)

And that's when I'll start using Linux as my daily driver.

Honestly, installing Ubuntu is almost idiot-proof at this point.

[–] throws_lemy@lemmy.nz 5 points 10 months ago* (last edited 10 months ago) (2 children)

I do agree with you; the obstacle is that many applications either aren't available on Linux or aren't as powerful as they are on Windows. For me it's MS Excel: many of my office clients use VBA in Excel spreadsheets to do calculations.

[–] Reptorian@lemmy.zip 4 points 10 months ago* (last edited 10 months ago)

At least we might finally have a viable Photoshop replacement soon. GIMP is getting NDE (non-destructive editing), Krita might be getting a foreground extraction tool at some point, and Pixellator might have better tools, though its NDE department is solid. The thing is, all of them are missing something, but I'm betting on GIMP after CMYK_Student's arrival to GIMP development.

I tried adding foreground selection based on guided selection, but I was unable to fix the noise in the in-between selections and was unable to build Krita. We would have Krita with foreground selection if it weren't for that.

[–] frankpsy@lemm.ee 32 points 10 months ago (1 children)

Apple: what's wrong with just 8GB RAM?

[–] douglasg14b@lemmy.world 33 points 10 months ago (8 children)

Yeah, and solder it onto the board while you're at it! Who ever needs to upgrade or perform maintenance anyway?

[–] DumbAceDragon@sh.itjust.works 31 points 10 months ago (2 children)

"Wanna see me fill entire landfills with e-waste due to bullshit minimum requirements?"

"Wanna see me do it again?"

[–] archomrade@midwest.social 7 points 10 months ago

All I can think of:

Hi kids, do you like violence? Wanna see me stick nine-inch nails through each one of my eyelids? Wanna copy me and do exactly like I did? Try 'cid and get fucked up worse than my life is?

[–] furzegulo@lemmy.dbzer0.com 21 points 10 months ago (2 children)

no ai ain't gonna come into my pc

[–] Shurimal@kbin.social 15 points 10 months ago

Unless it's locally hosted, doesn't scan every single file on my storage and doesn't send everything I do with it to the manufacturer's server.

[–] Secret300@sh.itjust.works 6 points 10 months ago (1 children)

Personally I really want it to, but only locally run AI like Llama or whatever it's called

[–] evranch@lemmy.ca 4 points 10 months ago (3 children)

Do it, it's easy and fun and you'll learn about the actual capabilities of the tech. I started a week ago and I'm a convert on the utility of local AI. I had to go back to Reddit for it, but r/localllama has tons of good info. You can actually run useful models at a conversational pace.

This whole thread is silly because VRAM is what you need; I'm running some pretty good coding and general-knowledge models on a 12GB Radeon. Almost none of my 32GB of system RAM is used, lol. Either Microsoft is out of touch or they're hiding an amazing new algorithm.

Running in system RAM works, but processing on the regular CPU is painfully slow, over 10x slower.
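
If you want to try it yourself, a bare-bones llama-cpp-python setup looks roughly like this (the model file below is just a placeholder; grab whatever quantized GGUF fits your VRAM):

```python
# Minimal local-inference sketch with llama-cpp-python (one of several options).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder: any quantized GGUF that fits your VRAM
    n_gpu_layers=-1,  # offload every layer to the GPU; set to 0 to feel the painfully slow CPU-only path
    n_ctx=4096,       # context window
)

out = llm("Write a Python function that reverses a string.", max_tokens=200)
print(out["choices"][0]["text"])
```

Keeping the whole model in VRAM is what gets you that conversational pace.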

[–] nyakojiru@lemmy.dbzer0.com 18 points 10 months ago* (last edited 10 months ago)

For a long time now they've been making a massive, slow effort to finally get end users to migrate to Linux (and I'm a lifelong Windows guy)

[–] SomeGuy69@lemmy.world 16 points 10 months ago

A low amount of RAM becomes the AI detox mechanism of this century.

[–] HidingCat@kbin.social 14 points 10 months ago (1 children)

Great, so it'll take AI to set 16GB as the minimum.

I still shudder that there are machines being sold with 8GB of RAM; that's just barely enough.

[–] query@lemmy.world 13 points 10 months ago (2 children)

AI PC sounds like something that will be artificially personal more than anything else.

[–] banneryear1868@lemmy.world 11 points 10 months ago

Makes sense. 16GB is sort of the new "normal", although 8GB is still quite enough for everyday casual use. "AI PC" is a marketing term just like "AI" itself.

[–] Poem_for_your_sprog@lemmy.world 9 points 10 months ago (3 children)

Opening Excel and Outlook on a Win11 PC brings you to almost 16GB of memory used. I don't know how anybody is still selling computers with 8GB of RAM.

[–] NoRodent@lemmy.world 9 points 10 months ago (1 children)

That doesn't work even as hyperbole. I literally just opened an Excel spreadsheet with 51192 rows (I had Outlook already open) and those two programs still only take 417 MB of RAM combined. Meanwhile Firefox is at 2.5 GB. Yes, my total RAM currently used is 13.8 GB, but I have 64 GB of RAM installed, and you should know that generally the more RAM you have, the more of it gets utilized by the system (this is true for all modern OSes, not just Windows), which is a good thing, because it means better performance: you can cache more things in RAM that would otherwise need to be read from disk. Unused RAM is wasted RAM. So even if one computer uses 16 GB of RAM for some relatively simple tasks, it doesn't necessarily mean it wouldn't run, or would grind to a halt, on a system with less RAM.
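
If you want to see it on your own machine, a quick sketch with psutil (just an illustration, nothing official) shows that "available" is the number that matters, not "used":

```python
# Quick check of memory headroom: high "used" on a big-RAM box is normal
# because the OS keeps RAM busy; "available" is what new programs can still claim.
import psutil

mem = psutil.virtual_memory()
gib = 1024 ** 3
print(f"total:     {mem.total / gib:5.1f} GiB")
print(f"used:      {mem.used / gib:5.1f} GiB")
print(f"available: {mem.available / gib:5.1f} GiB")  # the headroom that actually matters
```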

[–] rbesfe@lemmy.ca 6 points 10 months ago

Uh... No, it doesn't. 8GB is definitely tight these days, but for simple word processing, email, and spreadsheet usage it still works fine.

[–] Liz@midwest.social 5 points 10 months ago (2 children)

Why in the hell do those programs take up so much space?

[–] irdc@derp.foo 5 points 10 months ago (1 children)

Ah good. Now I know what specs not to buy.

[–] LemmyIsFantastic@lemmy.world 7 points 10 months ago

You have fun sticking with MS and running your 8GB of RAM, that'll show 'em!
