Anubis: Protecting Websites from Aggressive Scraping
AI Summary
To combat relentless scraping by AI companies, which often leads to website downtime, we've deployed Anubis as a protective measure. Anubis uses a Proof-of-Work system inspired by Hashcash, a scheme originally proposed to mitigate email spam. The cost of the challenge is negligible for an individual visitor but adds up quickly for mass scrapers, deterring them at scale. The longer-term goal is to refine this approach by fingerprinting headless browsers through characteristics such as font rendering, so that legitimate users rarely see a Proof-of-Work challenge at all. Note that Anubis requires modern JavaScript features, which plugins like JShelter may disable; users should disable such plugins to access the site smoothly. The website currently runs Anubis version 1.21.3.
Key Concepts
Proof-of-Work, best known today as a blockchain consensus mechanism, was originally designed to deter spam and denial-of-service attacks by requiring computational effort from the service requester. By forcing a provable amount of work to be done before a service is granted, it makes malicious activity costly to perform at scale while remaining cheap for a single honest request.
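The asymmetry described above can be sketched with a minimal Hashcash-style challenge: the client searches for a nonce whose hash has a required prefix, while the server verifies with a single hash. This is an illustrative sketch only, not Anubis's actual implementation; the challenge string, difficulty parameter, and SHA-256 hex-prefix rule are all assumptions for the example.

```python
import hashlib
import itertools

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Client side: find a nonce so that SHA-256(challenge + nonce)
    begins with `difficulty` hex zeroes. Expected work grows by a
    factor of 16 per extra zero."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: verification costs a single hash, so checking
    solutions stays cheap even under heavy load."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# A low difficulty is imperceptible for one visitor, but a scraper
# fetching millions of pages pays the solving cost on every request.
nonce = solve_challenge("example-session-token", 4)
assert verify("example-session-token", nonce, 4)
```

Tuning the difficulty is the whole game: set it high enough that bulk crawling becomes uneconomical, but low enough that a phone solves it in well under a second.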
Web scraping is the process of automatically extracting large amounts of data from websites. It is often used for data analysis, price comparison, and other automated tasks but can lead to server overload and downtime if done excessively.
Category: Technology
Original source: https://www.wesnoth.org
Summarized by Mente