They used AI to destroy AI
This is getting ridiculous. Can someone please ban AI? Or at least regulate it somehow?
As with everything, it has its good sides and its bad sides. We need to be careful and use it properly, and the same applies to the people creating this technology.
The problem is, how? I can set it up on my own computer using open-source models and some of my own code. That's really hard to regulate.
Once a technology or even an idea is out there, you can't really make it go away; AI is here to stay. Generative LLMs are just a small part of it.
Joke's on them. I'm going to use AI to estimate the value of content, and now I'll get the kind of content I want, though fake, which they will have to generate.
Should have called it "Black ICE".
Definitely falls under the category of a Trap ICE card.
I have no idea why the makers of LLM crawlers think it's a good idea to ignore bot rules. The rules are there for a reason and the reasons are often more complex than "well, we just don't want you to do that". They're usually more like "why would you even do that?"
Ultimately you have to trust what the site owners say. The reason why, say, your favourite search engine returns the relevant Wikipedia pages and not a bazillion random old page revisions from ages ago is that Wikipedia said "please crawl the most recent versions using canonical page names, and do not follow the links to the technical pages (including history)". Again: why would anyone index those?
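That "please crawl this, not that" conversation happens through robots.txt. Here's a minimal sketch of what a polite crawler does before fetching anything, using Python's standard `urllib.robotparser`. The robots.txt content below is a hypothetical example in the spirit of Wikipedia's real one (which disallows the technical entry points used for old revisions and history), not a copy of it.

```python
# Sketch: how a well-behaved crawler consults robots.txt before fetching.
# The rules below are illustrative, modeled loosely on Wikipedia's setup.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /w/index.php?  # technical entry point: old revisions, history, diffs
Allow: /wiki/            # canonical, current article pages
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Canonical article page: allowed.
print(parser.can_fetch("*", "https://example.org/wiki/Cat"))
# Old revision reached via the technical entry point: disallowed.
print(parser.can_fetch("*", "https://example.org/w/index.php?title=Cat&oldid=123"))
```

A crawler that skips this check (as the ones in the article apparently do) ends up downloading exactly the junk the site owner told everyone not to bother with.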
They want everything. If it exists but it's not in their dataset, then they want it.
They want their AI to answer any question you could possibly ask it. Filtering out what is and isn't useful doesn't achieve that.
Because it takes work to obey the rules, and you get less data for it. A theoretical competitor could ignore them, get more data, and gain some vague advantage from it.
I'd not be surprised if the crawlers they used were bare-basic utilities set up to just grab everything without worrying about rules and the like.
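A "bare-basic" crawler like that might look like the hypothetical sketch below: it extracts every link on a page and queues all of them, with no robots.txt check and no judgment about which pages are worth indexing. The class and page content are made up for illustration; no real crawler is being quoted.

```python
# Hypothetical sketch of a rule-ignoring crawler: queue every link found,
# including the technical history/old-revision pages nobody should index.
from html.parser import HTMLParser

class GrabEverything(HTMLParser):
    def __init__(self):
        super().__init__()
        self.queue = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.queue.append(value)  # no rules consulted at all

page = """
<a href="/wiki/Cat">Cat</a>
<a href="/w/index.php?title=Cat&action=history">history</a>
<a href="/w/index.php?title=Cat&oldid=123">old revision</a>
"""

crawler = GrabEverything()
crawler.feed(page)
print(crawler.queue)  # all three links end up queued, useful or not
```

Against a crawler like this, a tarpit only has to keep generating pages with links on them; the "grab everything" logic does the rest.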
Because you are coming from the perspective of a reasonable person
These people are billionaires who expect to get everything for free. Rules are for the plebs, just take it already
I’m imagining a sci-fi spin on this where AI generators are used to keep AI crawlers in a loop, and they accidentally end up creating some unique AI culture or relationship in the process.
I guess this is what the first iteration of the Blackwall looks like.
Gotta say "AI Labyrinth" sounds almost as cool.