[–] redtea@lemmygrad.ml 3 points 2 months ago (1 children)

How big are the files for the finished model, do you know?

[–] KnilAdlez@hexbear.net 2 points 2 months ago* (last edited 2 months ago) (1 children)

That's a great question! The models come in different sizes: one large 'foundation' model is trained first, and that one is then used to train (distill) smaller models. US companies generally don't release their foundation models (I think), but Meta, Microsoft, DeepSeek, and a few others release smaller ones you can download from ollama.com. A rule of thumb is that 1 billion parameters takes about 1 gigabyte of disk at 8-bit precision (full 16-bit weights are roughly double that). The foundation models are hundreds of billions, if not trillions, of parameters, but you can get a good model at 7-8 billion parameters, small enough to run on a gaming GPU.
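If you want to estimate it yourself, the back-of-envelope math is just parameters × bytes per parameter. Here's a minimal sketch (the precision levels and sizes are illustrative rules of thumb, not exact figures for any specific model):

```python
# Rough model file size: parameters * bytes per parameter.
# Common cases: fp16 = 2 bytes, 8-bit quant = 1 byte, 4-bit quant = 0.5 bytes.
BYTES_PER_PARAM = {"fp16": 2.0, "q8": 1.0, "q4": 0.5}

def model_size_gb(params_billion: float, precision: str = "q8") -> float:
    """Estimated file size in GB for a model with the given parameter count."""
    return params_billion * BYTES_PER_PARAM[precision]

# An 8B-parameter model: ~16 GB at fp16, ~8 GB at 8-bit, ~4 GB at 4-bit.
for prec in ("fp16", "q8", "q4"):
    print(f"8B model at {prec}: ~{model_size_gb(8, prec):.0f} GB")
```

In practice the models on ollama.com usually ship quantized, so the actual download for an 8B model tends to land at the lower end of that range. Something like `ollama pull llama3:8b` will show you the download size as it fetches.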

[–] redtea@lemmygrad.ml 2 points 2 months ago