Gotta make curl illegal now. Or why stop there? All Http clients! Nothing could go wrong 😊
Actually currently it contains this:
User-agent: *
Disallow: /
Well, that actually is a blanket ban for everyone, so something else must be at play here.
robots.txt, I guess? Yes, you can just ignore it, but you shouldn’t if you’re developing a responsible web scraper.
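For what it's worth, checking robots.txt before fetching is trivial with Python's standard library. A minimal sketch (the user agent and URL are made up; the rules match the blanket-ban snippet quoted above):

```python
from urllib.robotparser import RobotFileParser

# The blanket-ban robots.txt quoted above
rules = """User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# With "Disallow: /" every path is off-limits for every user agent,
# so a responsible scraper should skip this site entirely.
allowed = rp.can_fetch("MyScraper/1.0", "https://example.com/some/page")
print(allowed)  # False
```

In a real scraper you'd call `rp.set_url(...)` plus `rp.read()` to fetch the live robots.txt instead of parsing a hardcoded string.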
I often use AI as a glorified search engine these days. It’s actually kinda convenient for getting ideas to look into further when I run into a problem. But would I just take some AI output without reviewing it? Hell no😄
My current company does and I hate it so much. Who even got that idea in the first place? Linux always dominated server-side stuff, no?
By now I’m willing to fill out one of these things. If they show me a second one, I’m out. Not wasting my time training some AI
When I last tried letting some AI write actual code, it didn’t even compile 🙂 Another time it did compile, but the result was trash anyway, and I had to spend as much time fixing it as I would have spent writing it myself in the first place.
So far I can only use AI as a glorified search engine 😅