I would love to think so. But the word "verified" suggests more.
That makes me think you might, perhaps, be able to set it to exec("stuff") or True.
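If so, a minimal sketch of that trick, assuming the value is evaluated as a Python expression: exec() always returns None, so or-ing it with True runs arbitrary code while the whole expression still comes out truthy.

    # exec() returns None, so `or True` keeps the expression truthy
    # while still running whatever code was injected.
    value = exec("print('injected code ran')") or True
    print(value)  # True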
...
while allowing legitimate users and verified crawlers to browse normally.
What is a "verified crawler", though? What I worry about is: is it now only big companies like Google that are allowed to have them?
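For what it's worth, "verified" usually means reverse-confirmed DNS: take the connecting IP, do a reverse lookup, check that the hostname belongs to the crawler's domain, then resolve that hostname forward and confirm it maps back to the same IP. A rough sketch for Googlebot (the domain suffixes follow Google's published guidance; the rest is illustrative):

    import socket

    def is_verified_googlebot(ip):
        # Reverse lookup: the PTR record must sit under a Google
        # crawler domain.
        try:
            host, _, _ = socket.gethostbyaddr(ip)
        except socket.herror:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same
        # IP, otherwise anyone could publish a fake PTR record.
        try:
            return ip in socket.gethostbyname_ex(host)[2]
        except socket.gaierror:
            return False

Nothing in that check is gated on being a big company, but it does assume the crawler controls its reverse DNS, which hobbyist crawlers on residential connections typically don't.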
I agree that it's difficult to enforce such a requirement on individuals. That said, I don't agree that nobody cares about the content they post. If they have "something cool they made with AI generation", then it's not a big deal to have to mark it as AI-generated.
Finally, we don't have to worry about spelling anymore; it's all written the way it's read.
An intelligence service monitors social media. They may as well have said, "The sky is blue."
More interesting is: "Sharing as a force multiplier" -- OpenAI
Do you know of a provider that is actually private? The few privacy policies I checked all had something like "We might keep some of your data for some time for anti-abuse or other reasons"...
Too bad that's based on macros. A full preprocessor could require that all keywords and names in each scope form a prefix code, and then allow us to freely concatenate them.
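A quick sketch of why the prefix-code property makes that work (the identifiers here are made up for illustration): if no name is a prefix of another, then at any position in a concatenated string at most one name can match, so a greedy left-to-right scan recovers the unique token sequence.

    # "let", "x", "plus" form a prefix code: none is a prefix of another.
    NAMES = ("let", "x", "plus")

    def split_tokens(s, names=NAMES):
        # At each position at most one name matches, so greedy
        # decoding is unambiguous.
        tokens, i = [], 0
        while i < len(s):
            match = next((n for n in names if s.startswith(n, i)), None)
            if match is None:
                raise ValueError(f"undecodable at position {i}")
            tokens.append(match)
            i += len(match)
        return tokens

    print(split_tokens("letxplusx"))  # ['let', 'x', 'plus', 'x']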
Aren't USAID grants public?
Yes, OpenAI wishes everyone else had to have authorization to do model training...
Fortunately, their ToS don't matter all that much; it's easy to use their model through a third party without ever touching them.
Any accessibility service will also see the "hidden links", and while a blind person with a screen reader will notice if they wander off into generated pages, it will waste their time too. Especially if they don't know about such a "feature", they'll be very confused.
Also, I don't know about you, but I absolutely have a use for crawling X, Google Maps, Reddit, and YouTube, and getting information from them without interacting with the services myself.