[–] LesserAbe@lemmy.world 3 points 9 months ago (2 children)

How would someone go about running these things locally?

[–] paddirn@lemmy.world 4 points 9 months ago

LM Studio seems like the easiest option at this point.
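
Worth noting that LM Studio can also serve a loaded model over a local OpenAI-compatible API (default `http://localhost:1234/v1`), so you can script against it. A minimal sketch, assuming a model is already loaded in the app and the local server is started; the model name is a placeholder:

```python
# pip install openai
from openai import OpenAI

# LM Studio's local server ignores the API key but the client requires one.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

resp = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio routes to the loaded model
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```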

[–] GBU_28@lemm.ee 2 points 9 months ago

llama.cpp, depending on your hardware.

Download a GGUF model from Hugging Face and run a single command.
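
A rough sketch of that flow, using the llama-cpp-python bindings rather than the raw llama.cpp CLI; the repo and file names are just one example GGUF quantization, not a recommendation:

```python
# pip install llama-cpp-python huggingface_hub
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Example GGUF model; swap in whatever repo/quantization fits your hardware.
model_path = hf_hub_download(
    repo_id="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
    filename="mistral-7b-instruct-v0.2.Q4_K_M.gguf",
)

# n_gpu_layers=0 keeps inference on CPU; raise it if you have a supported GPU.
llm = Llama(model_path=model_path, n_ctx=2048, n_gpu_layers=0)

out = llm("Q: What is llama.cpp? A:", max_tokens=128, stop=["Q:"])
print(out["choices"][0]["text"])
```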