• SerotoninSwells@lemmy.world · 18 hours ago

    Hi! I didn’t forget about your response. I sifted through the links to find the study in question. I imagine my response isn’t going to satisfy you but please hear me out. I’m open to hearing your rebuttals regarding this too.

    The study is absolutely correct in what it studied and the results it found. My main issues are with the scope and some of the methodology.

    On one hand, I see that the “AI” they used was able to solve captchas better than humans. My main issue is that it’s a single tool. Daily, I work with dozens of different frameworks and services, some of which claim to leverage AI, and the results and ability to pass a captcha vary with each one. There’s an inevitable back and forth as these tools learn how to bypass us and as we counter those changes. In the real world there isn’t just one tool that everyone uses as their bot, as is the case in the study, so the study doesn’t quite reflect how this actually plays out.

    I recognize that the list of sites they chose was the top 200 sites on the web. That said, there are newer, up-and-coming captcha services that weren’t tested. It’s also worth noting that the “captcha-less” approaches, like Turnstile, are still captchas; they just skip straight to a proof-of-work style check and cut the human out altogether.
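
    To make the proof-of-work idea concrete, here’s a minimal sketch of the general technique: the server hands out a random challenge, the browser burns some CPU finding a nonce whose hash meets a difficulty target, and the server verifies the answer cheaply. This is illustrative only and is not how Turnstile is actually implemented.

    ```python
    import hashlib
    import secrets

    DIFFICULTY = 4  # number of leading zero hex digits required (made-up target)

    def issue_challenge() -> str:
        # Server side: hand the client a random challenge string.
        return secrets.token_hex(16)

    def solve(challenge: str) -> int:
        # Client side: brute-force a nonce until the hash meets the target.
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
            if digest.startswith("0" * DIFFICULTY):
                return nonce
            nonce += 1

    def verify(challenge: str, nonce: int) -> bool:
        # Server side: a single hash is enough to check the client's work.
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
        return digest.startswith("0" * DIFFICULTY)

    challenge = issue_challenge()
    nonce = solve(challenge)          # costs the client some CPU time
    assert verify(challenge, nonce)   # costs the server almost nothing
    ```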

    We should absolutely take studies like this to heart and find better approaches that don’t piss off humans. But the reality is that these tools are working to cut down on a vast amount of bot traffic that you never see. I understand if you’re not okay with that line of reasoning, because I’m asking you to trust me, a random Internet stranger. I imagine each company could show you metrics on false positive (FP) rates and how many bots are actually passing their captcha. Most do their best to keep the false positive rate down.

    • MonkderVierte@lemmy.ml · 18 hours ago

      I mean, it’s been a while since I worked in backend. But one of the basic tools was to limit requests per second per IP, so they can’t DDoS you. If a bot crawls a webpage you host with the intention to share, what’s the harm? And if one particular bot/crawler misbehaves, block it. And if you don’t intend to share, put it behind a VPN.
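
      Something like this rough sketch of a per-IP limiter is what I have in mind (a fixed-window counter; the numbers are made up and it’s nothing production-grade):

      ```python
      import time
      from collections import defaultdict

      WINDOW_SECONDS = 10   # length of the counting window
      MAX_REQUESTS = 20     # max requests per IP per window (made-up limit)

      _hits: dict[str, list[float]] = defaultdict(list)

      def allow(ip: str) -> bool:
          """Return True if this IP is still under its request budget."""
          now = time.monotonic()
          # Keep only the timestamps that fall inside the current window.
          recent = [t for t in _hits[ip] if now - t < WINDOW_SECONDS]
          if len(recent) >= MAX_REQUESTS:
              _hits[ip] = recent
              return False  # over the limit: reject or throttle
          recent.append(now)
          _hits[ip] = recent
          return True
      ```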

      Is that out of date?

        • SerotoninSwells@lemmy.world · 16 hours ago

        Unfortunately it is out of date.

        • IPs used by bots are now *highly* distributed. We will see the same bot use hundreds of thousands of IP addresses, with each IP easily making only one or two requests, which is very hard to catch with volume-based detection (say 100,000 IPs make two requests each: that’s 200,000 requests, yet no single IP comes anywhere near a per-IP limit). Also, I’m not sure where you are in the world, but it’s more common in countries outside of North America to have IP addresses that are heavily shared. Not to mention, there are companies in Europe that will pay you for the use of your IP address explicitly for bots.

        • You might think you could limit by IP classification, but bots increasingly use residential-classified IPs.

        • As for allowing good bots, that isn’t so much an issue; they respect the robots.txt that companies publish. More and more, though, we see bots scraping data for LLMs that don’t respect this file, and bots scraping prices or doing anything else you don’t want them doing, like credential stuffing, aren’t going to respect it either (see the robots.txt sketch after this list).

        • In terms of using a VPN: absolutely limit outside access to sensitive infrastructure, but that’s not really where most companies experience pain from bots. That’s not to say we don’t see bots attempting vulnerability scanning; those requests can be highly distributed too.
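
        To illustrate the robots.txt point: a well-behaved crawler parses the file and honors it before fetching anything, roughly like the sketch below, while the bots I’m describing simply never make that check. The file contents, user agents, and URLs here are invented for illustration.

        ```python
        from urllib.robotparser import RobotFileParser

        # Hypothetical robots.txt contents (invented for this example).
        ROBOTS_TXT = """\
        User-agent: GPTBot
        Disallow: /

        User-agent: *
        Disallow: /prices/
        """

        rp = RobotFileParser()
        rp.parse(ROBOTS_TXT.splitlines())

        # A polite crawler checks before every fetch; a scraper just skips this step.
        print(rp.can_fetch("GPTBot", "https://example.com/anything"))        # False
        print(rp.can_fetch("PriceScraper", "https://example.com/prices/x"))  # False
        print(rp.can_fetch("PriceScraper", "https://example.com/about"))     # True
        ```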

        Companies ultimately reach out to vendors like Cloudflare because the usual methods aren’t working for them. While onboarding some clients, I’ve seen more bot requests than human requests, which can be detrimental to a business.

        I’m happy to answer any other questions you might have. While I do work in the industry, I don’t know everything. I just want to reiterate that I am not a fan of how things currently are on the Internet. I wish this were illegal, as I think it would cut down on a lot of bot traffic and make things much more manageable for everyone.