Hello! Basically, I need to process a very large file (about 4000 lines) and free AI chatbots like ChatGPT can't handle it. I'd like to split it into smaller parts and process each part separately. I'm having a very hard time finding a chatbot with a free API, though. The only one I found is HuggingChat, but even when I wait 1 second between requests, it starts giving rate limit errors after just a few of them.
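
Here's roughly the chunk-and-retry loop I have in mind, as a minimal sketch. It assumes a generic OpenAI-style chat completions endpoint; the URL, API key, model name, and chunk size are just placeholders, not any specific free service:

```python
import time
import requests

API_URL = "https://example.com/v1/chat/completions"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                              # placeholder key
CHUNK_SIZE = 200                                      # lines per request

def split_into_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield the file as chunks of roughly chunk_size lines."""
    with open(path) as f:
        lines = f.readlines()
    for i in range(0, len(lines), chunk_size):
        yield "".join(lines[i:i + chunk_size])

def process_chunk(chunk, max_retries=5):
    """Send one chunk, backing off exponentially on rate-limit (429) responses."""
    for attempt in range(max_retries):
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={
                "model": "some-free-model",  # placeholder model name
                "messages": [{"role": "user", "content": f"Process this part:\n\n{chunk}"}],
            },
            timeout=60,
        )
        if resp.status_code == 429:      # rate limited: wait and try again
            time.sleep(2 ** attempt)
            continue
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]
    raise RuntimeError("still rate limited after retries")

if __name__ == "__main__":
    results = [process_chunk(c) for c in split_into_chunks("big_file.txt")]
    print("\n".join(results))
```

The exponential backoff on 429 is the part I was hoping would get around the rate limits, but with HuggingChat it didn't help much.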

any suggestion? thanks in advance!

EDIT: I also tried to run GPT4All on my laptop (with integrated graphics) and it took around 2-5 minutes to answer a simple "hello" prompt, so it's not really feasible :(

BakedCatboy@lemmy.ml 2 points 4 days ago

What model size did you run on your laptop? I have an Intel NUC with an i7 and no dedicated GPU, so I run various models on CPU. I can't run anything larger than ~14B, but models up to around ~7B aren't too slow. If I try a 32B model I get a similar experience to yours. I tend not to go below 4B, because that's when models start getting dumb and not following instructions well, so it just depends on how complex your task is.
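
If you want to see whether a smaller model is tolerable on your hardware, a rough sketch with the gpt4all Python bindings looks like this (the model name is just an example from their catalogue; swap in whatever ~4B-7B quantized model you like):

```python
from gpt4all import GPT4All

# Example model name only; pick any small quantized .gguf from the GPT4All catalogue.
# The first call downloads the model, which can take a while. Runs on CPU by default.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

with model.chat_session():
    reply = model.generate("Summarize this chunk of my file:\n\n<chunk text here>", max_tokens=512)
    print(reply)
```

A 3B-7B model on CPU should respond in seconds rather than minutes for short prompts, though quality drops off quickly at the small end.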