This post was submitted on 08 Jun 2025
169 points (97.7% liked)

Fuck AI

3031 readers

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago

It's impossible. I set up this instance just to browse Lemmy from my own instance, but no, it was slow as hell the whole week. I got new pods, put Postgres on a different pod, pictrs on another, etc.

But it was still slow as hell. I didn't know what it was until a few hours ago: 500 GETs in a MINUTE from ClaudeBot and GPTBot. Wth is this? Why? I blocked the user agents using a blocking extension on NGINX, and now it works.
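
For reference, the block amounts to roughly this in plain nginx (a sketch, not my exact config; the extension does the same kind of user-agent matching, and the ports/names here are placeholders):

```
# Goes in the http {} block: map known AI crawler user agents to a flag.
map $http_user_agent $blocked_bot {
    default        0;
    ~*ClaudeBot    1;
    ~*GPTBot       1;
}

server {
    listen 80;
    server_name my-instance.example;  # placeholder

    location / {
        # Reject scrapers before the request ever reaches Lemmy/Postgres.
        if ($blocked_bot) {
            return 403;
        }
        proxy_pass http://127.0.0.1:1234;  # placeholder: lemmy-ui port
    }
}
```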

WHY? So Google can say that you should eat glass?

Life is hell now: before, at least anyone could put up a website; now even that is painful.

Sorry for the rant.

top 28 comments
[–] MonkderVierte@lemmy.zip 9 points 7 hours ago (1 children)

Patience, the AI bubble will burst soon.

[–] AstralPath@lemmy.ca 3 points 5 hours ago
[–] jagged_circle@feddit.nl 4 points 7 hours ago* (last edited 3 hours ago) (1 children)

Just cache. Read-only traffic should add negligible load to your server, or you're doing something horribly wrong.

[–] potatoguy@potato-guy.space 3 points 7 hours ago (2 children)

They're 1 CPU / 1 GB RAM pods; Postgres hits 100% CPU at 500 requests per minute. After I added the NGINX extension, it dropped to 10% at most. On weaker servers these bots create hell on earth; the config isn't the problem.

[–] jagged_circle@feddit.nl 1 points 3 hours ago

Load should be near zero for reads.

[–] jerkface@lemmy.ca 5 points 6 hours ago (1 children)

If it's hitting Postgres, it's not hitting the cache. Do you have a caching reverse proxy in front of your web application?

[–] potatoguy@potato-guy.space 1 points 5 hours ago* (last edited 5 hours ago) (1 children)

I don't have a cache, but the problem is solved now; I can browse Lemmy haha.

[–] jerkface@lemmy.ca 4 points 5 hours ago (1 children)

The nginx instance you already have in front of your app can perform caching and avoid hitting the app at all. The advantage is that it improves performance even against the stealthiest bots, including ones that don't exist yet. The disadvantage is that the AI scum still get what they want.
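
For example, something like this (just a sketch; the logged-in cookie name is an assumption, so check what your frontend actually sets):

```
# Goes in the http {} block: define the cache storage.
proxy_cache_path /var/cache/nginx/lemmy levels=1:2 keys_zone=lemmy_cache:10m
                 max_size=1g inactive=10m use_temp_path=off;

server {
    listen 80;
    server_name my-instance.example;  # placeholder

    location / {
        proxy_cache lemmy_cache;
        proxy_cache_valid 200 60s;       # cache OK responses for a minute
        proxy_cache_use_stale updating;  # serve stale while refreshing
        # Skip the cache for logged-in users (assumption: the auth
        # cookie is named "jwt" -- verify against your frontend).
        proxy_cache_bypass $cookie_jwt;
        proxy_no_cache $cookie_jwt;
        add_header X-Cache-Status $upstream_cache_status;  # for debugging
        proxy_pass http://127.0.0.1:1234;  # placeholder: your app's port
    }
}
```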

[–] potatoguy@potato-guy.space 1 points 5 hours ago (1 children)

Oh, cool. I'm going to look at it!

[–] jagged_circle@feddit.nl 1 points 3 hours ago

If that doesn't work for you, also look at Varnish and Squid.

[–] Mwa@thelemmy.club 11 points 11 hours ago* (last edited 11 hours ago) (1 children)

You can use either Cloudflare (proprietary) or Anubis (FOSS).

[–] jagged_circle@feddit.nl 0 points 7 hours ago (1 children)
[–] Mwa@thelemmy.club 5 points 7 hours ago (1 children)
[–] jagged_circle@feddit.nl 1 points 3 hours ago

Because it harms marginalized folks' ability to access content while also letting evil corp (and their fascist government) view (and modify) all encrypted communication with your site and its users.

It's bad.

[–] flamingos@feddit.uk 78 points 18 hours ago* (last edited 17 hours ago) (2 children)

You can enable Private Instance in your admin settings; this means only logged-in users can see content. It will prevent AI scrapers from slowing down your instance, since all they'll see is an empty homepage, so no DB calls. As long as you're on 0.19.11, federation will still work.

[–] melroy@kbin.melroy.org 11 points 12 hours ago

Same for Mbin.

[–] potatoguy@potato-guy.space 31 points 17 hours ago

Enabled, thanks for the tip!

[–] lena@gregtech.eu 8 points 12 hours ago

Cloudflare has pretty good protection against this, but I totally understand not wanting to use Cloudflare.

[–] termaxima@programming.dev 19 points 15 hours ago (1 children)

Anubis + Nepenthes is the answer.

[–] Finch9678@europe.pub 13 points 12 hours ago

Article for whoever was unaware, like me.

[–] melroy@kbin.melroy.org 7 points 12 hours ago

Haha, just wait until you get DDoSed by anonymous user agents. I've been there.

I'm talking 40k requests per 5 seconds.

[–] xep@fedia.io 46 points 19 hours ago (1 children)

At some point they're going to try to evade detection to continue scraping the web. The cat-and-mouse game continues, except now the "pirates" are big tech.

[–] brandon@piefed.social 31 points 19 hours ago* (last edited 19 hours ago) (2 children)

They already do. ("They" meaning AI companies generally; I don't know about Claude's or ChatGPT's bots specifically.) There are a number of tools server admins can use to help deal with this.

See also:

https://zadzmo.org/ is dead already, and Ars Technica is writing about them, so...

[–] lurch@sh.itjust.works 12 points 16 hours ago

These solutions have the side effect of making the bots stay on your site longer and generate more traffic. It's not for everyone.

[–] parpol@programming.dev 36 points 19 hours ago (1 children)

Use Anubis. That's pretty much the only defense bots have no way of circumventing.

[–] potatoguy@potato-guy.space 15 points 19 hours ago (1 children)

Yeah, I'm going to install it this week, but the NGINX extension seems to have solved the issue.

[–] melroy@kbin.melroy.org 4 points 12 hours ago

Which extension are you using, if I may ask?