
Long story short: my VPS, which forwards traffic to my servers at home over Tailscale, got hammered by thousands of requests per minute from Anthropic's Claude AI crawler, all of them coming from different AWS IPs.

The VPS has a 1 TB monthly bandwidth cap, but it's still kinda shitty to get huge spikes like today's 13 GB in just a couple of minutes.

How do you deal with something like this?
I'm only really running a Caddy reverse proxy on the VPS, which forwards my home server's services through Tailscale.

I'd really like to avoid solutions like Cloudflare, since they f over CGNAT users very frequently and all that. I don't think a WAF would help with this at all(?), but rate limiting on the reverse proxy might work.
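Something like this Caddyfile sketch is what I had in mind. Note that stock Caddy has no rate limiting built in, so the rate_limit directive below assumes the third-party mholt/caddy-ratelimit plugin (built with xcaddy); the domain, upstream Tailscale address, and limits are all placeholders:

```
# Assumes Caddy built with: xcaddy build --with github.com/mholt/caddy-ratelimit
{
	# Third-party directives need an explicit position in the handler order
	order rate_limit before basicauth
}

example.com {
	# Refuse known AI crawler user agents outright
	@aibots header_regexp User-Agent (?i)(claudebot|anthropic-ai|gptbot|bytespider)
	respond @aibots 403

	# Allow each client IP at most 60 requests per minute
	rate_limit {
		zone per_ip {
			key    {remote_host}
			events 60
			window 1m
		}
	}

	# Placeholder Tailscale address for the home server
	reverse_proxy 100.64.0.1:8080
}
```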

(The VPS has fail2ban, and I'm using /etc/hosts.deny for manual blocking. There's a WIP website on my root domain with a robots.txt that should be denying AWS bots as well...)
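The robots.txt is roughly along these lines. The user-agent tokens are the ones Anthropic and OpenAI publish for their crawlers, and of course this only helps against bots that actually honor robots.txt:

```
# Only honored by well-behaved crawlers
User-agent: ClaudeBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: GPTBot
Disallow: /
```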

I'm still learning and would really appreciate any suggestions.

sailorzoop@lemmy.librebun.com 1 points 8 hours ago (last edited 8 hours ago)

In my experience, git forges are especially hit hard

Is that why my Forgejo instance has been hit like crazy twice before...
Why can't we have nice things. Thank you!

EDIT: Hopefully Photon doesn't end up in their sights as well. Though after using the official Lemmy web UI for a while, I do really like it a lot.

poVoq@slrpnk.net 1 points 8 hours ago

Yeah, Forgejo and Gitea. I think insufficient caching on the git forges' side is partly what makes it especially bad, but in the end that's victim blaming 🫠

Mlmym seems to be the main target, I think, because it is mostly JavaScript-free and therefore easier to scrape. But the other Lemmy frontends are also not well protected. Lemmy-ui doesn't even make it easy to add a custom robots.txt; you have to manually override it in the reverse proxy.
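With Caddy, for example, that override can be a handle block that answers /robots.txt before anything reaches lemmy-ui (domain, path, and port here are just placeholders):

```
lemmy.example.com {
	# Serve our own robots.txt instead of letting lemmy-ui answer
	handle /robots.txt {
		root * /srv/lemmy-overrides
		file_server
	}

	# Everything else goes to lemmy-ui as usual
	handle {
		reverse_proxy localhost:1234
	}
}
```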