Andy Reid@lemmy.world to Technology@lemmy.world, English · 9 months ago
AI companies are violating a basic social contract of the web and ignoring robots.txt (www.theverge.com) · 26 comments
Aatube@kbin.social · 9 months ago:
robots.txt is purely textual; you can’t run JavaScript or log anything. Plus, a scraper that doesn’t intend to follow robots.txt wouldn’t query it in the first place.
BrianTheeBiscuiteer@lemmy.world, English · 9 months ago:
If it doesn’t get queried, that’s the fault of the web scraper. You don’t need JS built into the robots.txt file either. Just add a disallowed line like: here-there-be-dragons.html
Any client that hits that page (and maybe doesn’t pass a captcha check) gets banned. Or even better, they get a long stream of nonsense.
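The trap described above can be sketched in a few lines. This is a minimal illustration, not anyone's actual setup: the trap path matches the comment, but the in-memory ban set and the handler shape are assumptions for the example.

```python
# Honeypot sketch: robots.txt disallows a trap page that no polite
# crawler should ever fetch; any client that fetches it anyway is
# assumed to be ignoring robots.txt and gets its IP banned.

TRAP_PATH = "/here-there-be-dragons.html"  # trap name from the comment above

ROBOTS_TXT = (
    "User-agent: *\n"
    "Disallow: /here-there-be-dragons.html\n"
)

banned_ips: set[str] = set()  # in-memory; a real deployment would persist this


def handle_request(ip: str, path: str) -> tuple[int, str]:
    """Return (status_code, body) for a request from `ip` to `path`."""
    if ip in banned_ips:
        return 403, "Forbidden"
    if path == "/robots.txt":
        return 200, ROBOTS_TXT
    if path == TRAP_PATH:
        banned_ips.add(ip)  # client ignored the Disallow line: ban it
        return 403, "Forbidden"
    return 200, "Hello, polite visitor"
```

The captcha check mentioned above would slot in before the ban, so that a curious human who follows the link by hand can exonerate themselves.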
4am@lemm.ee, English · 9 months ago:
server {
    server_name herebedragons.example.com;
    root /dev/random;
}
PlexSheep@feddit.de, English · 9 months ago:
Nice idea! Better use /dev/urandom though, as that is non-blocking. See here.
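The "long stream of nonsense" is trivial at the application level too. A minimal sketch, assuming you want the equivalent of pointing the server at /dev/urandom: an endless generator of pseudorandom bytes that a streaming response can consume (os.urandom never blocks, unlike /dev/random, which can stall when the kernel entropy pool runs low).

```python
import os
from typing import Iterator


def nonsense_stream(chunk_size: int = 4096) -> Iterator[bytes]:
    """Yield an endless stream of pseudorandom byte chunks.

    Application-level stand-in for serving /dev/urandom: a scraper
    that hits the trap page just receives garbage forever (or until
    it gives up), wasting its bandwidth rather than yours alone.
    """
    while True:
        yield os.urandom(chunk_size)
```

Most web frameworks accept a generator like this as a chunked response body, so the nonsense is produced lazily and costs almost nothing until a scraper actually connects.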
gravitas_deficiency@sh.itjust.works, English · 9 months ago:
I actually love the data-poisoning approach. I think that sort of strategy is going to be an unfortunately necessary part of the future of the web.