this post was submitted on 28 Jul 2025
1252 points (99.5% liked)

Technology

[–] northendtrooper@lemmy.ca 22 points 1 week ago* (last edited 1 week ago) (2 children)

Honestly, there needs to be a Git project I can clone that uses a local LLM to auto-call all of the senators, congressmen, reps, whoever. Also call all of the Visa/Mastercard call centers non-stop. And I mean non-stop. Bonus points for using public-figure voices. As in, use AI to do our bidding.

[–] FauxLiving@lemmy.world 14 points 1 week ago

https://github.com/vndee/local-talking-llm

Seems like it would be a good place to start. You'd need to write the bit that sends the output to a VoIP service and receives the input from the same service.
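
A minimal sketch of that bridge loop, just to show the shape of it. The `transcribe`/`generate_reply`/`synthesize` functions and the audio transport here are hypothetical placeholders, not local-talking-llm's actual API or any real VoIP provider's; you'd swap in the real STT, LLM, and TTS calls and the provider's audio stream:

```python
# Hedged sketch: every function here is a placeholder standing in for
# whatever local-talking-llm and your VoIP provider actually expose.

def transcribe(audio: bytes) -> str:
    # placeholder for a local speech-to-text model
    return "hello"

def generate_reply(text: str) -> str:
    # placeholder for the local LLM call
    return f"You said: {text}"

def synthesize(text: str) -> bytes:
    # placeholder for a local text-to-speech model
    return text.encode("utf-8")

def handle_call(recv_audio, send_audio):
    """One call's conversation loop.

    recv_audio() yields inbound audio chunks from the VoIP service;
    send_audio(chunk) plays synthesized audio back into the call.
    """
    for chunk in recv_audio():
        text = transcribe(chunk)
        reply = generate_reply(text)
        send_audio(synthesize(reply))
```

The loop itself is the whole framework; everything hard lives inside the three placeholder calls and the transport.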

If you could get that going in a container, you could spawn a bunch of them on VPSs (ones with the hardware to run local AI would probably be expensive; probably better to use a hosting service if you're going to scale this).

I'm sure there are other conversational-agent frameworks people have built (it's a pretty simple loop to create), but if you wanted to get started, this isn't bad.

[–] Maeve@kbin.earth -5 points 1 week ago

It's already here, so...