There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple's claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won't be able to use it. There's a memory requirement for Predictive Code Completion in Xcode 16, and it's the closest thing we'll get from Apple to an admission that 8GB of memory isn't really enough for a new Mac in 2024.

[–] rottingleaf@lemmy.zip 6 points 5 months ago (2 children)

256MB or 512MB was fine for high-quality content in 2002, so what happened since then?

Suppose the number of pixels and everything else quadrupled - OK, then 2GB it is.

But 4GB being not enough? Do you realize what 4GB is?
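Spelling out that back-of-the-envelope scaling (purely illustrative numbers, not measurements):

# If memory needs simply tracked content size, quadrupling everything
# would take a well-equipped 2002 machine from 512 MB to about 2 GB.
base_ram_mb = 512      # assumed typical desktop circa 2002
scale_factor = 4       # "pixels and everything quadrupled"
print(f"{base_ram_mb * scale_factor / 1024:.1f} GB")   # -> 2.0 GB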

[–] lastweakness@lemmy.world 8 points 5 months ago (1 children)

They didn't just quadruple. They're orders of magnitude higher these days. So content really is a factor.

But that's not what's actually being discussed here; memory usage these days is much more a problem of bad practices than of content alone.

[–] rottingleaf@lemmy.zip -1 points 5 months ago

I know. BTW, if something is done an order of magnitude less efficiently than it could be (and once was), one might consider that the result of an intentional policy aimed at neutering development. It's just not clear whose. There are fewer corporations affecting this than there are big governments, and those are capable of reaching consensus from time to time. So it's not a conspiracy theory.

[–] Aux@lemmy.world 1 points 5 months ago (1 children)

One frame for a 4K monitor takes about 33MB of memory. You'd need three of them for the triple buffering used back in 2002, so well over a third of your 256MB would go simply to displaying a bloody UI. But there's more! Today we're using desktop compositing, so the more apps you run, the more memory you need just to display the UI. And that's only what the OS uses to render the final result; your apps use additional memory on top of that for high-res icons, fonts, photos, videos, etc. 4GB today is nothing.
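The framebuffer math works out roughly like this (a simple sketch assuming an uncompressed 32-bit frame, 4 bytes per pixel):

# Uncompressed 4K frame at 32-bit colour.
width, height, bytes_per_pixel = 3840, 2160, 4
frame_bytes = width * height * bytes_per_pixel      # 33,177,600 bytes
frame_mb = frame_bytes / 1e6                        # ~33.2 MB per frame
triple_buffered_mb = 3 * frame_mb                   # ~99.5 MB for three buffers
print(f"one 4K frame: {frame_mb:.1f} MB")
print(f"triple-buffered: {triple_buffered_mb:.1f} MB")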

I can tell you an anecdote: my partner was making a set of photo collages, about 7 artworks to be printed in large format (think 5m+ per side). Those 7 collages, with their source material, took 500 gigs on an external drive. Tell me more about 256MB, lol.

[–] rottingleaf@lemmy.zip -2 points 5 months ago* (last edited 5 months ago) (1 children)

Yes, you wouldn't have 4K in 2002.

> 4GB today is nothing.

My normal usage would be kinda strained with it, but possible.

$ free -h
               total        used        free      shared  buff/cache   available
Mem:            17Gi       3,1Gi        11Gi       322Mi       3,0Gi        14Gi
Swap:          2,0Gi          0B       2,0Gi
$ 
[–] Aux@lemmy.world -1 points 5 months ago (1 children)

I can do a cold boot and show you empty RAM as well. So fucking what?

[–] rottingleaf@lemmy.zip 3 points 5 months ago

It's not a cold boot and it's not empty.