this post was submitted on 13 Apr 2024
488 points (96.4% liked)

Technology
[–] hector@sh.itjust.works 6 points 7 months ago (2 children)

Wow, 13GB! I did some heavy stuff on my computer with a shit ton of Docker servers running together, plus deployment, and I never reached 13GB!

Without disclosing private company information lol what are you doing ;)

[–] ben_dover@lemmy.ml 4 points 7 months ago* (last edited 7 months ago) (2 children)

not OP, but I have to run the frontend and backend of a project in Docker simultaneously (multiple Postgres and Redis DBs, queues, a search index, etc., plus two webservers), plus a few browser tabs and two VSCode instances open; that regularly pushes my machine over 15GB of RAM usage

pretty much like this
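A stack like the one described might look roughly like this in Compose form; every service name, image tag, and memory limit below is illustrative, not taken from the actual project:

```yaml
# Hypothetical docker-compose.yml sketch of a frontend + backend dev stack.
# All names, images, and limits are made up for illustration.
services:
  frontend:
    image: node:20            # webserver 1: frontend dev server
    mem_limit: 1g
  backend:
    image: my-backend:dev     # webserver 2: API backend (placeholder image)
    mem_limit: 2g
  postgres-main:
    image: postgres:16
  postgres-analytics:
    image: postgres:16        # second Postgres instance
  redis-cache:
    image: redis:7
  redis-queue:
    image: redis:7            # queue backing store
  search:
    image: elasticsearch:8.13.4   # search index; a JVM service like this alone can want 1-2GB
```

Even with per-service limits, seven or eight containers like these plus the host tooling lands in the 15GB territory mentioned above.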

[–] Veraxus@lemmy.world 1 points 7 months ago* (last edited 7 months ago)

That is basically my use-case. You add a DB service (or two), DNS, reverse proxy, Redis, Memcached, etc... maybe some containers for additional proprietary backend services like APIs, and then the applications themselves that need those things to run... it adds up FAST. The advantage is that you can have multiple projects all running simultaneously, and you can add/remove/swap them pretty easily.

RAM is cheap. There is no excuse for shipping an 8GB computer... even if it's mostly going to be used for family photos and the internet.

[–] Veraxus@lemmy.world 2 points 7 months ago

Running a suite of services in containers (DBs, DNS, reverse proxy, Memcached, Redis, Elasticsearch, shared services, etc) plus a number of discrete applications that use all those things. My day-to-day usage hovers around 20GB with spikes to 32GB (my max allocation) when I run parallelized test suites.

Docker's memory usage really adds up fast.
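One quick way to see exactly where it adds up is `docker stats`. The sketch below totals the per-container memory column with awk; a canned sample stands in for live output (and it assumes all values are reported in MiB):

```shell
# Total the memory column from `docker stats --no-stream` style output.
# The variable below is a canned sample; in practice you would pipe in:
#   docker stats --no-stream --format "{{.Name}} {{.MemUsage}}"
sample='web 512MiB / 7.6GiB
postgres 1024MiB / 7.6GiB
redis 256MiB / 7.6GiB'

# Strip the "MiB" suffix from column 2 and sum it across containers
echo "$sample" | awk '{gsub(/MiB/, "", $2); total += $2} END {print total " MiB total"}'
# prints: 1792 MiB total
```

With a real stack of 8-10 containers, a sum like this makes the jump from "each service looks small" to 20GB+ very visible.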