
Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he "absolutely" believes that Amazon will soon start charging a subscription fee for Alexa

[–] art@lemmy.world 125 points 1 year ago (6 children)

We need to move AI from the cloud to our own hardware running in our homes. Free, open-source, privacy-focused hardware. It'll eventually be very affordable.

[–] Soundhole@lemm.ee 51 points 1 year ago* (last edited 1 year ago) (4 children)

That's already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.

llama.cpp and koboldcpp let anyone run models locally, even on a CPU alone if no dedicated graphics card is available (although more slowly). And there are numerous open-source models available that can be fine-tuned for just about any task.
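To give a concrete flavor, here's a minimal sketch using the llama-cpp-python bindings (the model path and prompt are placeholders; any GGUF-format model you've downloaded should work):

```python
# Minimal CPU-only inference with llama-cpp-python (bindings for llama.cpp).
# The model path is a placeholder; point it at any GGUF model on disk.
from llama_cpp import Llama

llm = Llama(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

out = llm("Q: Name the planets in the solar system. A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"].strip())
```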

Hell, you can even run llama.cpp on Android phones.

This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections for looking up information.

[–] Zetta@mander.xyz 8 points 1 year ago* (last edited 1 year ago) (1 children)

Yes, and you can run a language model like Pygmalion AI locally on koboldcpp and have a naughty AI chat as well. Or non-sexual roleplay.

[–] Soundhole@lemm.ee 9 points 1 year ago (3 children)

Absolutely, and there are many, many models that have iterated on and surpassed Pygmalion, as well as loads of uncensored models specifically tuned for erotic chat. Steamy roleplay is one of the driving forces behind the rapid development of the technology on lower-powered, local machines.

[–] Chreutz@lemmy.world 17 points 1 year ago

Never underestimate human ingenuity

When they're horny

[–] das@lemellem.dasonic.xyz 3 points 1 year ago (1 children)

And where would one look for these sexy sexy AI models, so I can avoid them, of course...

[–] Soundhole@lemm.ee 3 points 1 year ago* (last edited 1 year ago)

Hugging Face is where the models live. Anything that's uncensored (and preferably based on Llama 2) should work.

Some popular suggestions at the moment might be HermesLimaRPL2 7B and MythomaxL2 13B for general roleplay that can easily include NSFW.

There are lots of talented people releasing models every day, tuned to assist with coding, translation, roleplay, general assistance (like ChatGPT), writing, all kinds of things, really. Explore and try different models.

General rule: if you don't have a dedicated GPU, stick with 7B models. Otherwise, the bigger the better.
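If it helps, here's a rough sketch of grabbing a single quantized GGUF file with the huggingface_hub library (the repo and filename are illustrative; browse Hugging Face for current uploads):

```python
# Download one quantized file from a Hugging Face repo rather than the whole repo.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="TheBloke/MythoMax-L2-13B-GGUF",  # illustrative repo name
    filename="mythomax-l2-13b.Q4_K_M.gguf",   # pick a quantization that fits your RAM
)
print("Saved to:", path)
```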

[–] Zetta@mander.xyz 1 points 1 year ago (1 children)

Which models do you think beat Pygmalion for erotic roleplay? Curious for research haha

[–] Soundhole@lemm.ee 1 points 1 year ago* (last edited 1 year ago) (1 children)

Hey, I replied below to a different post with the same question, check it out.

[–] Zetta@mander.xyz 1 points 1 year ago (1 children)

Oh I see, sorry for the repeat question. Thanks!

[–] Soundhole@lemm.ee 1 points 1 year ago

lol nothing to be sorry about, I just wanted to make sure you saw it.

[–] MaxHardwood@lemmy.ca 5 points 1 year ago (1 children)

GPT4All is a neat way to run an AI chatbot on your local hardware.
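For anyone curious, the gpt4all Python package makes it about as simple as it gets (the model filename is illustrative; it downloads automatically on first run):

```python
# Local chat via the gpt4all Python bindings; no GPU or internet needed after download.
from gpt4all import GPT4All

model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf")  # illustrative model file
with model.chat_session():
    print(model.generate("Why run a language model locally?", max_tokens=200))
```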

[–] Soundhole@lemm.ee 2 points 1 year ago (1 children)

Thanks for this, I haven't tried GPT4All.

Oobabooga is also very popular and relatively easy to run, but it's not my first choice, personally.

[–] teuast@lemmy.ca 3 points 1 year ago

it does have a very funny name though

[–] scarabic@lemmy.world 1 points 1 year ago (2 children)

Don’t these models require rather a lot of storage?

[–] Soundhole@lemm.ee 1 points 1 year ago* (last edited 1 year ago) (1 children)

13B quantized models, generally the most popular for home computers with dedicated GPUs, are between 6 and 10 gigs each. 7B models are between 3 and 6. So, no, not really?

It is relative, so I guess if you're comparing that to an Atari 2600 cartridge then, yeah, it's hella huge. But you can store multiple models for the same storage cost as a single modern video game install.
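Those file sizes follow pretty directly from the math; a back-of-the-envelope sketch (the ~4.5 bits per weight is an assumption, typical of 4-bit quantization formats once you include their overhead):

```python
# Rough model file size: parameters x bits-per-weight / 8, ignoring small metadata.
def model_size_gb(params_billions: float, bits_per_weight: float = 4.5) -> float:
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

print(f"7B  at ~4.5 bits/weight: {model_size_gb(7):.1f} GB")   # ~3.9 GB
print(f"13B at ~4.5 bits/weight: {model_size_gb(13):.1f} GB")  # ~7.3 GB
```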

[–] scarabic@lemmy.world 1 points 1 year ago

Yeah that’s not a lot. I mean… the average consumer probably has 10GB free on their boot volume.

It is a lot to download, if we’re talking about ordinary consumers. Not unheard of, though - some games on Steam are 50GB+.

So okay, storage is not prohibitive.

[–] art@lemmy.world 1 points 1 year ago (1 children)

Storage is getting cheaper every day, and the models are getting smaller while performing just as well.

[–] scarabic@lemmy.world 1 points 1 year ago

I’m just curious - do you know what kind of storage is required?

[–] teuast@lemmy.ca 1 points 1 year ago (1 children)

> In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections for looking up information.

You're probably right, but I kinda hope you're wrong.

[–] Soundhole@lemm.ee 1 points 1 year ago (1 children)
[–] teuast@lemmy.ca 3 points 1 year ago (1 children)

Call it paranoia if you want. Mainly I don't have faith in our economic system to deploy the technology in a way that doesn't eviscerate the working class.

[–] Soundhole@lemm.ee 2 points 1 year ago* (last edited 1 year ago)

Oh, you are 100% justified in that! It's terrifying, actually.

But what I am envisioning is using small, open-source models installed on our phones that can answer questions or just keep us company. These would be completely private, controlled by the user only, and require no internet connection. We are already very close to this reality: local AI models can already run on Android phones, but the small AI "brains" best suited to phones are still pretty stupid (for now).

Of course, living in our current Capitalist Hellscape, it's hard not to imagine that going awry to the point where we'll all 'rent' AI from some asshole who spies on everything we do, censors the AI for our own 'protection', or puts ads in there somehow. But I guess I'm a dreamer.

[–] pyldriver@lemmy.world 23 points 1 year ago (2 children)

God I wish, I would just love local voice control to turn my lights and such on and off... but noooooooooooo

[–] Otkaz@lemmy.world 37 points 1 year ago (1 children)
[–] pyldriver@lemmy.world 1 points 1 year ago

I have Home Assistant, but have not heard anything good about Rhasspy. I just want to control lights and be able to use it to play music and set timers. That being said, I run Home Assistant right now and can control it with Alexa and Siri, but... I would like local only.
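For what it's worth, local-only light control is just an HTTP call once Home Assistant is running; here's a sketch using its REST API (the URL, token, and entity_id are placeholders for your own setup):

```python
# Toggle a light through Home Assistant's local REST API.
import requests

HA_URL = "http://homeassistant.local:8123"  # placeholder address
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"      # created in your HA user profile

resp = requests.post(
    f"{HA_URL}/api/services/light/toggle",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"entity_id": "light.living_room"},  # placeholder entity
)
resp.raise_for_status()
```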

[–] Kolanaki@yiffit.net -1 points 1 year ago* (last edited 1 year ago) (1 children)

I have that with just my phone, using Wiz lights and IFTTT. It's the only home automation I even have, because it's the only one I found that doesn't necessarily need a special base station like an Alexa or Google Home.

[–] AA5B@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (1 children)

But you want a local base station, else there’s no local control. You want to use local-only networks like Z-Wave, Zigbee, Thread, Bluetooth, etc., even though they require a base station, because that’s what gives you a local-only way of controlling things.

Matter promises that a base station may no longer be necessary for smart devices to control each other, but it is rolling out very slowly.

I also wonder what I’ll be able to do with the Thread radio in the iPhone 15 Pro

[–] Kolanaki@yiffit.net -1 points 1 year ago* (last edited 1 year ago) (2 children)

The base stations are what uses the cloud/AI shit. The setup I have doesn't even require an Internet connection or wifi; it's entirely bluetooth. Why in the hell would I want a base station that costs money, is controlled by Amazon or Google, and requires an Internet connection for my local shit?

I don't want a piece of hardware that does nothing but act like a fucking middleman for no good reason.

[–] foggenbooty@lemmy.world 1 points 1 year ago

That is not necessarily true. Some base stations use the internet, yes, but not all. For example, a Philips Hue bridge does not require internet access, nor does Lutron Caséta. As the other person posted, Home Assistant is the absolute best (IMO) way to do everything locally without the internet.

Your system, while it might work for you, does not scale well due to the limited range and reliability of Bluetooth. You'd likely be better off adopting a more robust protocol like Z-Wave or Zigbee and getting a hub that you have full control over.

[–] AA5B@lemmy.world 1 points 1 year ago (1 children)

I’m a huge fan of Home Assistant. You might look into it

[–] a1studmuffin@aussie.zone 12 points 1 year ago (1 children)

It's the Year of the Voice for Home Assistant. Given their current trajectory, I'm hopeful they'll have a pretty darn good replacement for the most common use cases of Google Home/Alexa/Siri in another year: setting timers, shopping list management, music streaming, doorbell/intercom management. If you're on the fence about a Nabu Casa subscription, pull the trigger, as it helps them stay independent and not get bought out or destroyed by commercial interests.

[–] AA5B@lemmy.world 4 points 1 year ago (1 children)

Thumbs up for Nabu Casa and Home Assistant!

I haven’t yet played with the local voice stuff but have been following it with interest. Actually, now that Raspberry Pis are starting to become available again, I’m on the fence between buying a few more vs. finding something with a little more power, specifically for voice processing.

[–] foggenbooty@lemmy.world 1 points 1 year ago

Get something with a little more power. Pis are priced beyond the point where they make sense these days. You can get an Intel N100 system on AliExpress/Amazon for pretty cheap now, and I've got mine running Proxmox hosting all kinds of stuff.

I do wonder how much of those voice assistants could run on-device. Most of what I use Bixby for (I know. I KNOW.) is setting timers. I think simple things like that could run entirely on the phone. It's got a shocking amount of processing power in it.

[–] AA5B@lemmy.world 2 points 1 year ago (1 children)

While you may have points against Apple and how effective Siri may be, with this latest generation of products even the watch has enough processing power to do voice processing on device. No ads. No cloud services.

[–] whofearsthenight@lemm.ee 1 points 1 year ago

Pretty much. If you want a voice assistant right now, Siri is probably the best in terms of privacy. I bought a bunch of Echos early, then they got a little shitty, but I was in, and now I just want them out of my house except for one thing - music. Spotify integration makes for easy multi-room audio in a way that doesn't really work as well on the other platform I'll consider (Apple/Siri), and it basically adds Sonos-like functionality for a tiny fraction of the price. The Siri balls and AirPlay are just not as good, and of course, don't work as well with Spotify.

But Alexa is so fucking annoying that at this point I mostly just carry my phone (iPhone) and talk to that, even though it's a little less convenient, because I'm really goddamned tired of hearing "by the way..."