prototype_g2

joined 1 year ago
[–] [email protected] 13 points 2 months ago (1 children)

Not if you are part of the AI-bros club. There is a reason Marketing agencies insist on using the term Artificial Intelligence.

Unfortunately, this is not common knowledge. Experts and Marketing agencies explain Machine Learning to the masses by saying "it looks at the data and learns from it, like a human would", which, combined with the name Artificial Intelligence and other terms like Neural Networks and Machine Learning, can make someone think these things are actually intelligent.

Furthermore, we humans can see humanity where there is none. We see faces where there are no faces; we empathize with things that aren't even alive. So when this thing shows up, capable of creating somewhat coherent text, people are quick to anthropomorphize the machine. To add to this, we are also very language-focused: if someone is really good with the language they speak, they are usually seen as more intelligent.

And finally, never underestimate tech illiteracy.

[–] [email protected] 2 points 2 months ago (1 children)

That is true. Take, for example, movies. Cinema studios with big budgets are usually very risk averse, simply because the cost of failure is so high. So they have to make sure they can turn a profit. But how can you be sure any given thing will be profitable? Well, that is a prediction, and to predict anything, you need data to base that prediction on. Predictions are based on past events. And so they make sequel after sequel. They make things that have been proven to work. New things, by virtue of being new, don't have tons of data (past examples) to make good predictions from, and so they avoid new things. This results in the homogenization of art. Homogenization induced by Capital, as Capital only sees value in profit, and thus, for Capital, only predictably profitable art is given the resources to flourish.

Machine-Learning-generated images are the epitome of this. All output is based on previous input. The machine is constructed not to deviate too much from the training data (that is what minimizing the loss function means), and thus it struggles with things it does not have much data on, like original ideas.
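To make the loss-function point concrete, here is a minimal toy sketch (not any specific model) of what "training" means: nudging a parameter so the output moves ever closer to the training data. By construction, the machine is pulled toward reproducing past examples.

```python
# Toy model: predict y from x with a single weight (y_hat = w * x).
# The training data is the pattern y = 2x; training cannot do
# anything except pull w toward reproducing that pattern.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0    # start from an uninformed parameter
lr = 0.05  # learning rate (step size)

for _ in range(500):
    for x, y in data:
        y_hat = w * x
        # Squared-error loss: (y_hat - y)^2. Its gradient tells us
        # which way to nudge w to shrink the distance to the data.
        grad = 2 * (y_hat - y) * x
        w -= lr * grad

print(round(w, 3))  # converges to ~2.0, the pattern in the data
```

The point of the sketch: minimizing the loss is, by definition, minimizing deviation from the training data, so the output can only ever be a recombination of what was already there.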

I think that what we’re likely to see are parallel worlds of art. The first and biggest being the homogenous, public and commercial one which we’re seeing now but with more of it produced by machines, and the other a more intimate, private and personal one that we discover by tuning back into our real lives and recognising art that has been made by others who are doing the same.

That's kind of already a thing, just without the AI. As in the example above, Capital wants predictable profit. Therefore only the most widely appealing, proven-to-be-profitable art gets significant budgets. Creative and unique ideas are just too risky, and are therefore relegated to the indie space, where, should any ever become successful, Capital is willing to help... under the condition that it gets most of the money (think, for example, of how Spotify takes most of the revenue made by the songs it distributes).


By "Capital" I mean those who own things necessary to produce value.

[–] [email protected] 4 points 2 months ago

You have yet to refute the deduction based argument:

If you use the machine to think for you, you will stop thinking.

Not thinking leads to a degradation of thinking skills.

Therefore, using a machine to think for you will lead to a degradation of thinking skills.

This is not inductive reasoning, like a study, where you look at data and induce a conclusion. This is pure deductive reasoning. Refute it.

That’s a lot of non-scientific blogs to talk about the non-scientific study I pointed out. Still no objective evidence.

They are a bunch of blogs from people sharing that, after using AI for extended periods of time, their ability to solve problems degraded because they stopped thinking and sharpening their cognitive skills.

So what would satisfy your need for objective evidence? What would I need to show you for you to change your mind? How would a satisfactory study be conducted?

I didn’t say much about the “hominem” but I think you’re defining Microsoft?

"Defining Microsoft"... I didn't define Microsoft?

Did you mean "defend"? What do you mean, "defend"? Again, ad hominem. Instead of substantiating why you say the document doesn't count, you attack the ones who made it.


All your dismissals and you have yet to refute the argument all these people make:

If you use the machine to think for you, you will stop thinking.

Not thinking leads to a degradation of thinking skills.

Therefore, using a machine to think for you will lead to a degradation of thinking skills.

All you have to do is refute this argument, and then it will be up to me to defend myself. Refute the argument. It's deductive reasoning.

[–] [email protected] 6 points 2 months ago (2 children)

The classic Ad Hominem. Instead of actually refuting the arguments, you instead attack the ones making them.

So, tell me, which part of "As Bainbridge [7] noted, a key irony of automation is that by mechanising routine tasks and leaving exception-handling to the human user, you deprive the user of the routine opportunities to practice their judgement and strengthen their cognitive musculature, leaving them atrophied and unprepared when the exceptions do arise.” is affected by the conflict of interest with the company? This is a note made by Bainbridge. The argument is as follows:

If you use the machine to think for you, you will stop thinking.

Not thinking leads to a degradation of thinking skills.

Therefore, using a machine to think for you will lead to a degradation of thinking skills.

It is not too hard to see that if you stop doing something for a while, your skill at that thing will degrade over time. Part of getting better is learning from your own mistakes. The AI will rob you of those learning experiences.

What is the problem with the second quote? It is not an opinion, it is an observation.

Others have noticed this already:

https://www.darrenhorrocks.co.uk/why-copilot-making-programmers-worse-at-programming/

https://www.youtube.com/watch?v=8DdEoJVZpqA

https://nmn.gl/blog/ai-illiterate-programmers

https://www.youtube.com/watch?v=cQNyYx2fZXw


This, of course, only happens if you use the AI to think for you.

[–] [email protected] 11 points 2 months ago (4 children)

Microsoft did a study on this and they found that those who made heavy usage of AI tools said they felt dumber:

"Such consternation is not unfounded. Used improperly, technologies can and do result in the deterioration of cognitive faculties that ought to be preserved. As Bainbridge [7] noted, a key irony of automation is that by mechanising routine tasks and leaving exception-handling to the human user, you deprive the user of the routine opportunities to practice their judgement and strengthen their cognitive musculature, leaving them atrophied and unprepared when the exceptions do arise."

Cognitive ability is like a muscle. If it is not used regularly, it will decay.

It also said it made people less creative:

"users with access to GenAI tools produce a less diverse set of outcomes for the same task, compared to those without. This tendency for convergence reflects a lack of personal, contextualised, critical and reflective judgement of AI output and thus can be interpreted as a deterioration of critical thinking."

LINK

[–] [email protected] 2 points 2 months ago (1 children)
[–] [email protected] 22 points 2 months ago

If all it takes to be a “real artist” is drawing proficiently

I think you are misunderstanding the argument.

Pro-AI folk say that being anti-AI as a digital artist is hypocrisy because you also use a computer. Here it is shown that, despite not using a computer, the artist is still able to create their art, because there is more to the visual arts than the tools used to make it. This puts to rest the idea that using digital art tools is somehow hypocritical while being against AIGen.

The person making the argument is not saying that being unable to draw proficiently excludes you from being an artist. They are just saying that real artists do not need a computer program to create their art, much like the performance or installation artists you mentioned.

[–] [email protected] 1 points 2 months ago

nor do I have the talent

And why do you think you do not have "talent"? What is that "talent" you speak of? Is it something people are born with? What is the problem with what you make, if all you care about is what people put into art?

Art is whatever people put into it

"It" what? The pronoun "it" is referring to what? Art? Without this clarification I cannot accurately make sense of anything else in your response.

Keep in mind that, when defining a term, you cannot use that term in its own definition.

[–] [email protected] 5 points 2 months ago (4 children)

Art can be automated

Under what definition of art can that be possible? Is art to you nothing more than an image? Why automate art and not other tasks? What is the point of automating art? Why would you not want to make art yourself and instead delegate it to a machine?

[–] [email protected] 5 points 2 months ago (4 children)

This is a matter of coding a good enough neuron simulation, running it on a powerful enough computer, with a brain scan we would somehow have to get - and I feel like the brain scan is the part that is farthest off from reality.

So... Sci-Fi technology that does not exist. You think the "neurons" in the Neural Networks of today are actually neuron simulations? Not by a long shot! They are not even trying to be. "Neuron" in this context means "a thing that holds a number from 0 to 1". That is it. There is nothing else.
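For what it's worth, here is roughly everything an artificial "neuron" does (an illustrative sketch; real networks just stack millions of these): a weighted sum of its inputs, squashed into the 0-to-1 range.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial "neuron": a weighted sum of inputs, squashed
    by a sigmoid into a single number between 0 and 1."""
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

# No dendrites, no spikes, no neurotransmitters -- just arithmetic
# producing one number in (0, 1).
a = neuron([0.5, 0.3], [0.8, -0.2], 0.1)
print(a)  # a single float strictly between 0 and 1
```

That single number is the entire extent of the biological metaphor.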

That’s an unnecessary insult - I’m not advocating for that, I’m stating it’s theoretically possible according to our knowledge, and would be an example of a computer surpassing a human in art creation. Whether the simulation is a person with rights or not would be a hell of a discussion indeed.

Sorry about the insulting tone.

I do also want to clarify that I’m not claiming the current model architectures will scale to that, or that it will happen within my lifetime. It just seems ridiculous for people to claim that “AI will never be better than a human”, because that’s a ridiculous claim to have about what is, to our current understanding, just a computation problem.

That is the reason why I hate the term "AI". You never know whether the person using it means "Machine Learning Technologies we have today" or "Potential technology which might exist in the future".

And if humans, with our evolved fleshy brains that do all kinds of other things can make art, it’s ridiculous to claim that a specially designed powerful computation unit cannot surpass that.

Yeah... you know not every problem is computable, right? The halting problem is the classic example of one that isn't.

Also, I'm not interested in discussing Sci-Fi future tech. At that point we might as well be talking about Unicorns, since it is theoretically possible for future us to genetically modify an equine and give it a horn on its forehead.


Also, why would you want such a machine anyways?

[–] [email protected] 14 points 2 months ago (6 children)

It’s not a matter of if “AI” can outperform humans, it’s a matter of if humanity will survive to see that and how long it might take.

You are not judging what is here. The tech you speak of, that will surpass humans, does not exist. You are making up a Sci-Fi fantasy and acting like it is real. You could say it may perhaps, at some point, exist. At that point we might as well start talking about all sorts of other technically possible Sci-Fi technology which does not exist beyond fictional media.

Also, would simulating a human and then forcing them to work non-stop count as slavery? It would. You are advocating for the creation of synthetic slaves... But we should save moral judgement for when that technology is actually on the horizon.

AI is a bad term because when people hear it they start imagining things that don't exist, and start operating in the imaginary rather than in what is actually here. Because what is here cannot go beyond what is already there; that is the nature of minimizing the loss function.

[–] [email protected] 3 points 2 months ago

I’m not sure what the rest of the message has to do with the fundamental assertion that ai will never, for the entire future of the human race, “outperform” a human artist. It seems like it’s mostly geared towards telling me I’m dumb.

It is my attempt at an explanation of how the machine fundamentally works, which, as an obvious consequence of its nature, cannot but mimic. I'm pretty sure you do not know the inner workings of the "learning", so yes... I'm calling you incompetent... in the field of Machine Learning. I even gave you a link to a great, in-depth explanation of how these machines work! Educate yourself, so that your ignorance (in this specific field) vanishes.

Correct, humans are flesh bags. Prove me wrong?

  1. Human is "A member of the primate genus Homo, especially a member of the species Homo sapiens, distinguished from other apes by a large brain and the capacity for speech. "

  2. Flesh is "The soft tissue of the body of a vertebrate, covering the bones and consisting mainly of skeletal muscle and fat. "

  3. Flesh does not have brains or the capacity for speech

  4. Therefore, humans are not flesh.

I suppose I should stop wasting my time talking to you, then, as you see me as nothing more than an inanimate object with no consciousness or thoughts, as is flesh.
