this post was submitted on 28 Sep 2024
630 points (100.0% liked)

Science Memes

all 41 comments
[–] [email protected] 114 points 8 months ago* (last edited 8 months ago) (2 children)

How many of these books will be total garbage, written just to fulfill a prearranged quota?

Now the LLMs are filled with a good amount of nonsense.

[–] [email protected] 58 points 8 months ago (2 children)

Just use the LLM to make the books that the LLM then uses, what could go wrong?

[–] [email protected] 31 points 8 months ago (5 children)

Someone's probably already coined the term, but I'm going to call it LLM inbreeding.

[–] [email protected] 18 points 8 months ago

I suggested this term in academic circles, as a joke.

I also suggested hallucinations ~3-6 years ago only to find out it was ALSO suggested in the 1970s.

Inbreeding, lol

[–] [email protected] 4 points 8 months ago (1 children)

The real term is synthetic data

[–] [email protected] 3 points 8 months ago (1 children)

but it amounts to about the same

[–] [email protected] 4 points 8 months ago

In computer science, garbage in, garbage out (GIGO) is the concept that flawed, biased or poor quality ("garbage") information or input produces a result or output of similar ("garbage") quality. The adage points to the need to improve data quality in, for example, programming.

There was some research article applying this 70s computer science concept to LLMs. It was published in Nature and hit major news outlets. Basically they further trained GPT on its output for a couple generations, until the model degraded terribly. Sounded obvious to me, but seeing it happen on the www is painful nonetheless...
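The degradation described above can be illustrated with a toy sketch. This is not the setup from the Nature paper, just a hypothetical minimal analogue: a unigram "model" is repeatedly refit on its own samples. Once a token drops out of a generated corpus, the next model assigns it zero probability, so it can never come back and diversity only shrinks.

```python
import random
from collections import Counter

random.seed(0)

def train(corpus):
    # "Training": estimate a unigram distribution from token counts.
    counts = Counter(corpus)
    total = sum(counts.values())
    tokens = list(counts)
    weights = [counts[t] / total for t in tokens]
    return tokens, weights

def generate(tokens, weights, n):
    # "Generation": sample a new corpus from the fitted model.
    return random.choices(tokens, weights=weights, k=n)

# Start with a corpus drawn from a rich vocabulary of 100 tokens.
corpus = [random.randint(0, 99) for _ in range(200)]
vocab_sizes = [len(set(corpus))]

# Each generation trains on the previous generation's output.
for _ in range(10):
    tokens, weights = train(corpus)
    corpus = generate(tokens, weights, 200)
    vocab_sizes.append(len(set(corpus)))

# Tokens absent from a corpus have probability zero in the next model,
# so the vocabulary can only shrink or stay flat across generations.
print(vocab_sizes)
```

Real LLM collapse is subtler (it shows up first in the tails of the distribution), but the one-way loss of rare content is the same mechanism.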

[–] [email protected] 3 points 8 months ago

It's quite similar to another situation known as data incest

[–] [email protected] 3 points 8 months ago

Soylent AI? Auto-infocannibalism

[–] [email protected] 3 points 8 months ago

It can only go right because corporations must be punished for trying to replace people with machines.

[–] [email protected] 4 points 8 months ago

That would be terrible because they are both some of the best academic publishers in the humanities.

[–] [email protected] 49 points 8 months ago (4 children)

And they expect you to do this for free?

[–] [email protected] 24 points 8 months ago (1 children)

Do they not have to pay for the privilege? Or is this not referring to academic publishing? (It’s not super clear, but context indicates academic?)

[–] [email protected] 27 points 8 months ago

If it is that makes it even worse. Academic publishers need to be abolished.

[–] [email protected] 11 points 8 months ago

Nah, they get “Exposure”!

/s

[–] [email protected] 7 points 8 months ago (2 children)

Anyone who reviews for the major publishers is part of the problem.

[–] [email protected] 7 points 8 months ago

For profit corporations don't deserve your volunteer work.

[–] [email protected] 2 points 8 months ago (1 children)

And yet if you aren't a reviewer, it makes your CV look worse.

[–] [email protected] 2 points 8 months ago (1 children)

Agreed that you should have some kind of "service" on your CV, but reviewing is pretty low impact. And if you want to review, you can choose something other than the predatory publishers.

[–] [email protected] 1 points 8 months ago

Such as? They're all predatory just to varying degrees.

[–] [email protected] 48 points 8 months ago

Feed the LLM with LLM generated books. No resentment at all!

[–] [email protected] 46 points 8 months ago

Jfc that's gross

[–] [email protected] 17 points 8 months ago (5 children)

Honestly sometimes I feel like I'm the only one on Lemmy who likes AI

[–] [email protected] 65 points 8 months ago (2 children)

AI absolutely has its benefits, but it's impossible to deny the ethical dilemma in forcing writers to feed their work to a machine that will end up churning out a half assed version that also likely has some misinformation in it.

[–] [email protected] 16 points 8 months ago (2 children)

And will likely take their professions

[–] [email protected] 15 points 8 months ago (1 children)

I don't think so, at least not for a while. Big corporations will surely try to market it that way, but we've already seen how badly AI can shit the bed when it feeds on its own content

[–] [email protected] 10 points 8 months ago

The trouble is that a fad doesn't have to be functional to be used by short-sighted trend chasers as a justification to make cuts. How many jobs did we see outsourced to India in a way that didn't even come close to matching the quality of the people laid off? The people who make the decision to replace jobs with AI systems will loudly declare success and move on to their next role before the long-term consequences are fully realized.

[–] Anyolduser 4 points 8 months ago

It will take some people's professions. People who write clickbait articles, schlock product reviews, pulp romance novels, and the things modern Hollywood describes as scripts might be out of a job.

Quality novels, hard-hitting journalism, and innovative storytelling of all sorts are outside the capability of LLMs and might always be. There's a world where nearly all run-of-the-mill writing is done by LLMs, but truly original works will always be made by people.

At the end of the day, though, if a person can't out-write an AI they might be in the wrong line of work.

[–] [email protected] 12 points 8 months ago

Remember! It's not AI hallucinations, it's simply bullshit!

[–] [email protected] 29 points 8 months ago

AI as a technology is fascinating and can be extremely useful, especially in places like the medical field. AI as a product in its current state is nothing more than dystopian plagiarism.

[–] [email protected] 15 points 8 months ago* (last edited 8 months ago) (1 children)

The company I work for recently rolled out Copilot, and it has been a mixed bag of reactions. The less savvy users were at first blown away by the demonstration but then got exasperated when it didn't work as they thought (one of them uploaded an Excel file, asked for some analysis it couldn't do, and came to me to complain about it). But for me and my team it has worked great. I've been uploading some of my Python and SQL scripts and asking for refactoring and added comments, or uploading my SQL script along with some example I found on Stack Overflow and asking it to apply the example's method to my script.

I say to everyone that if you don't know shit, the AI isn't going to help a lot, but if you have at least the basics, the AI will help you.

[–] [email protected] 3 points 8 months ago

Even just on simple stuff. I asked it to generate a description of something like

if x then set z to null

and it returned

"this will set z to null if x is true or false"

Easy to edit, but you have to pay attention.

[–] [email protected] 5 points 8 months ago

I like AI. But I'm not sure I like the way we use it if it's only to meet shareholders' expectations or to be a tool for greedy people. What is your opinion concerning the way we seem to use AI in academic research?

[–] [email protected] 2 points 8 months ago

Found the black and white only guy.

[–] [email protected] 16 points 8 months ago

So what you're saying is, don't beat the targets because fuck those guys. Understood.

[–] [email protected] 7 points 8 months ago* (last edited 8 months ago) (1 children)

Soylent Green is a lie anyway. You'd need to "soylentify" half the population every year to feed the other half if it were the only source of calories.

[–] [email protected] 29 points 8 months ago

No, the point is that they're just recycling the dissidents they were going to murder anyway.

[–] [email protected] 3 points 8 months ago

What's the academic terminology for "go pound sand"?