How to combat large amounts of AI scrapers
Drunk & Root@sh.itjust.works to Selfhosted@lemmy.world · English · 3 days ago · 54 comments
Every time I check my nginx logs it's more scrapers than I can count, and I could not find any good open source solutions.
Sheldan@lemmy.world · 22 hours ago
Some of them are at least honest and have it as a user agent.

krakenfury@lemmy.sdf.org · 21 hours ago
Is ignoring robots.txt considered “honest”?

Sheldan@lemmy.world · 19 hours ago
That’s not what I was talking about.
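For the crawlers that do announce themselves in the user agent, a minimal nginx sketch along these lines can turn them away. The bot names below are examples of commonly seen agents, not a complete list, and the header is trivial to spoof:

```nginx
# In the http{} context: flag requests whose user agent matches
# one of the self-identifying crawler names (case-insensitive).
map $http_user_agent $block_ai {
    default        0;
    ~*GPTBot       1;
    ~*ClaudeBot    1;
    ~*CCBot        1;
    ~*Bytespider   1;
}

server {
    listen 80;
    server_name example.com;

    location / {
        # Refuse the request outright for flagged agents.
        if ($block_ai) {
            return 403;
        }
        # ... normal static/proxy configuration continues here ...
    }
}
```

Anything that spoofs its user agent slips past this, so it only covers the "honest" scrapers mentioned above.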