this post was submitted on 21 Aug 2025
219 points (89.5% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


Some thoughts on how useful Anubis really is. Combined with comments I read elsewhere about scrapers starting to solve the challenges, I'm afraid Anubis will be outdated soon and we need something else.

[–] Dremor@lemmy.world 21 points 3 weeks ago (10 children)

Anubis isn't a challenge like a captcha. Anubis is a resource waster, forcing crawlers to solve a cryptographic proof-of-work challenge (basically like mining Bitcoin) before being allowed in. That's how it defends so well against bots: they don't want to waste their resources on needless computing, so they cancel the page load before it even happens and go crawl elsewhere.
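To make the "crypto challenge" concrete: Anubis-style schemes hand the client a random challenge string and require a nonce whose SHA-256 hash meets a difficulty target. This is a minimal sketch of that idea (function names and the leading-zero-hex target are illustrative assumptions, not Anubis's exact protocol):

```python
import hashlib
import os

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce so sha256(challenge + nonce) starts with
    `difficulty` hex zeroes -- this is the work the client must burn."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """A single hash -- verification is nearly free for the server."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

challenge = os.urandom(8).hex()        # server picks a fresh challenge
nonce = solve_challenge(challenge, 4)  # client grinds through hashes
assert verify(challenge, nonce, 4)     # server checks with one hash
```

The asymmetry is the whole point: solving takes many hash attempts, while checking the answer takes one.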

[–] tofu@lemmy.nocturnal.garden 7 points 3 weeks ago (9 children)

No, it works because the scraper bots don't have it implemented yet. Of course the companies would rather not spend additional compute resources, but their pockets are deep, and some have already adapted and solve the challenges.

[–] EncryptKeeper@lemmy.world 12 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

The point was never that Anubis challenges are something scrapers can’t get past. The point is it’s expensive to do so.

Some bots don't run JavaScript and can't solve the challenges, so they'd be blocked, but there was never any point in time where no scrapers could solve them.
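"Expensive" can be put in rough numbers. For a leading-zero-hex target, the average number of hash attempts grows as 16^difficulty (the difficulty and hash rate below are illustrative assumptions, not Anubis's actual settings):

```python
# Rough cost model for a hex-leading-zero proof of work.
# Numbers are illustrative assumptions, not Anubis defaults.
difficulty = 4                       # required leading zero hex digits
expected_hashes = 16 ** difficulty   # average attempts to find a nonce
hash_rate = 1_000_000                # hashes/sec on a modest single core

print(expected_hashes)               # 65536 attempts on average
print(expected_hashes / hash_rate)   # ~0.066 s of CPU per page fetched
```

A fraction of a second per page sounds cheap until a crawler multiplies it across millions of pages; the server, meanwhile, pays one hash per verification.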

[–] JuxtaposedJaguar@lemmy.ml -2 points 3 weeks ago (1 children)

Wait, so browsers that disable JavaScript won't be able to access those websites? Then I hate it.

Not everyone wants unauthenticated RCE from thousands of servers around the world.

[–] EncryptKeeper@lemmy.world 7 points 3 weeks ago

Not everyone wants unauthenticated RCE from thousands of servers around the world.

I've got really bad news for you, my friend.
