catty

joined 2 days ago
[–] catty@lemmy.world 1 points 19 hours ago (2 children)

That's only the start-up cost. What about the ongoing 24/7 running costs after two years?

[–] catty@lemmy.world 2 points 19 hours ago

ODroids don't meet European legal hazard limits on toxic fumes. I bought one back in the day and they explained they won't apply for the test because of "the cost"... not that it uses cheap solder that doesn't meet lead limits.

[–] catty@lemmy.world 9 points 19 hours ago* (last edited 19 hours ago) (8 children)

I dislike posts like this. Technology moves quickly. Pis are great for hobby electronics where you need a little computer. Want a cheap computer to run a few things 24/7 and know what you're doing? A Pi it is. You don't need to run containers on a Pi if you have the skills to install the dependencies manually. They cost pennies to run 24/7.
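For a rough sense of "pennies", here's the arithmetic with assumed figures (a Pi Zero idles around 1 W, an old laptop left on as a server often draws around 20 W, and €0.30/kWh is a guessed European tariff):

```python
HOURS_PER_YEAR = 24 * 365   # 8760
PRICE_PER_KWH = 0.30        # EUR, assumed tariff

def annual_cost(watts: float) -> float:
    """Annual electricity cost of a device drawing `watts` continuously."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * PRICE_PER_KWH

print(f"Pi Zero (~1 W):  {annual_cost(1):.2f} EUR/year")   # ~2.63
print(f"Laptop (~20 W): {annual_cost(20):.2f} EUR/year")   # ~52.56
```

So even at a factor-of-two error in the wattage guesses, the Pi stays in pocket-change territory.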

I think of Pis as beefed-up calculators. I've made lots of money using a Pi Zero to run code I needed running 24/7. Code I developed myself.
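As a sketch of the "install the dependencies manually and run it 24/7" approach, a minimal systemd unit is all a Pi needs (the unit name, user, and paths here are hypothetical):

```ini
# /etc/systemd/system/mytask.service  (hypothetical name and paths)
[Unit]
Description=Long-running hobby script
After=network-online.target

[Service]
ExecStart=/usr/bin/python3 /home/pi/mytask.py
Restart=on-failure
User=pi

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now mytask.service` and it survives reboots, no containers involved.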

An old laptop with outdated parts takes up lots of space, weighs a lot, and has components like the fan, keyboard, and touchpad that will most likely die soon and need replacing. That's an extra concern you don't want.

Someone below saying to use an old laptop if you're living with your parents and don't pay the electricity bill is a bit lame. Do your part for the world. Someone will be paying for it.

Ultimately, use what you want, but if you're just starting with servers, use a virtual machine on your computer and log in to it. You can dick about with it as much as you want, and reset it back to a working state in seconds.

[–] catty@lemmy.world 1 points 22 hours ago* (last edited 22 hours ago) (1 children)

But won't this be a mish-mash of different docker containers and projects creating an installation, dependency, upgrade nightmare?

[–] catty@lemmy.world 2 points 22 hours ago (1 children)

But its website is in Chinese. Also, what's the GitHub repo?

[–] catty@lemmy.world 9 points 1 day ago

It's noise, junk to get attention away from other things.

[–] catty@lemmy.world 3 points 1 day ago (2 children)

This looks interesting. Do you have experience with it? How reliable and efficient is it?

[–] catty@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

Try the beta on the GitHub repo, and use a smaller model!

[–] catty@lemmy.world 4 points 1 day ago

I'm getting very near real-time on my old laptop. Maybe a delay of 1-2 s whilst it creates the response.

[–] catty@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

I agree. It looks nice, explains the models fairly well, hides away the model settings nicely, and even recommends some initial models with low requirements to get started. I like the concept of plugins, but I haven't yet found a way to e.g. run Python code it creates and display the output in the window.
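GPT4All doesn't seem to offer this out of the box, but as a sketch of what such a plugin could do, the generated snippet can be run in a separate interpreter process and its output captured for display (plain Python, nothing GPT4All-specific):

```python
import os
import subprocess
import sys
import tempfile

def run_generated_code(code: str, timeout: float = 10.0) -> str:
    """Run a (model-generated) Python snippet in a child interpreter
    and return whatever it printed to stdout/stderr."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    try:
        result = subprocess.run(
            [sys.executable, path],
            capture_output=True, text=True, timeout=timeout,
        )
        return result.stdout + result.stderr
    finally:
        os.remove(path)

print(run_generated_code("print(2 + 2)"))  # prints "4"
```

Running it in a subprocess (rather than `exec()`) keeps crashes and infinite loops out of the host app, though it's not a real sandbox.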

 

I was looking back at some old Lemmy posts and came across GPT4All. Didn't get much sleep last night as it's awesome, even on my old (10-year-old) laptop with a Compute 5.0 NVIDIA card.

Still, I'm after more. I'd like to be able to generate images and view them in the conversation, and, if it generates Python code, to be able to run it (I'm using Debian and have a default Python env set up). Local file analysis would also be useful. CUDA Compute 5.0 / Vulkan compatibility is needed too, with the option to use some of the smaller models (1-3B, for example). A local API would also be nice for my own Python experiments.

Is there anything that can tick the boxes, even if I have to scoot across models for some of the features? I'd prefer a desktop client application over a Docker container running in the background.
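On the local-API point: recent GPT4All desktop builds can expose an OpenAI-compatible HTTP server (off by default; the port 4891 and the model name below are assumptions, so check the app's settings). A stdlib-only sketch of talking to it:

```python
import json
from urllib import request

# Assumed endpoint of GPT4All's local server (check the app's settings).
API_URL = "http://localhost:4891/v1/chat/completions"

def build_payload(prompt: str, model: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def ask(prompt: str, model: str = "Llama 3 8B Instruct") -> str:
    """Send the prompt to the local server (requires it to be running)."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = request.Request(
        API_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the wire format is OpenAI-compatible, any client library that lets you override the base URL should also work against it.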

 

I'm watching some retro television and this show is wild! Beauty contests with 16-year-old girls (though at the time it was legal for 16-year-olds to pose topless in newspapers), old racist comedians from working men's clubs doing their routines, Boney M, English singers of the era, and happy dance routines!

