this post was submitted on 19 May 2025
1594 points (100.0% liked)

Microblog Memes

7976 readers
2599 users here now

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerrilla marketing.
  4. Posters are encouraged to link to the toot or tweet etc. in the description of posts.

founded 2 years ago
(page 2) 50 comments
[–] [email protected] 7 points 2 weeks ago (11 children)

We weren't verifying things with our own eyes before AI came along either. We were reading Wikipedia, textbooks, and journals, attending lectures, etc., and accepting what we were told as fact (through the lens of critical thinking, checking claims as best we can against other hopefully true facts, and so on).

I'm a Relaxed Empiricist, I suppose :P Bill Bailey knew what he was talking about.

[–] [email protected] 6 points 2 weeks ago

All of those have (more or less) strict rules imposed on them to ensure the end recipient is getting reliable information, including, in the case of journals, being able to trace claims back to the actual methodology and the data that came out of it.

Generative AI has the express intention of jumbling its training data to create something "new" that only has to sound right. A better comparison to AI would be typing a set of words into a search engine and picking the first few links that you see, not scientific journals.

[–] [email protected] 7 points 2 weeks ago (8 children)

My hot take on students graduating college using AI is this: if a subject can be passed using ChatGPT, then it's a trash subject. If a whole course can be passed using ChatGPT, then it's a trash course.

It's not that difficult to put together a course that cannot be completed using AI. All you need is to give a sh!t about the subject you're teaching. What if, instead of assignments, the teacher had everyone sit down in a room at the end of the semester and write the essay on the spot, based on what they'd learned so far? No phones, no internet; just the paper, pencil, and you. Those relying on ChatGPT will never pass that course.

As damaging as AI can be, I think it also exposes a lot of systemic issues with education. Students feeling the need to complete assignments using AI could do so for a number of reasons:

  • students feel like the task is pointless busywork, in which case a) they are correct, or b) the teacher did not properly explain the task's benefit to them.

  • students just aren't interested in learning, either because a) the subject is pointless filler (I've been there before), or b) the course is badly designed, to the point where even a rote algorithm can complete it, or c) said students shouldn't be in college in the first place.

Higher education should be a place of learning for those who want to further their knowledge, profession, and so on. Right now, however, college is treated as a mandatory rite of passage into the world of work. It doesn't matter how meaningless the course or how little you've actually learned; for many people, having a degree is absolutely necessary to find a job. I think that's bullcrap.

If you don't want students graduating with ChatGPT, then design your courses properly, cut the filler from the curriculum, and make sure only those are enrolled who are actually interested in what is being taught.

[–] [email protected] 4 points 2 weeks ago

You get out of courses what you put into them. Throughout my degrees I've seen people either climb the career ladder to great heights or fail a job interview and work a McJob. All from the same course.

No matter the course, there will always be some students who find ingenious ways to waste it.

[–] [email protected] 6 points 2 weeks ago (2 children)

So it’s ok for political science degrees then?

[–] [email protected] 6 points 2 weeks ago

I'm a slow learner, but I still want to learn.

[–] [email protected] 6 points 2 weeks ago* (last edited 2 weeks ago) (14 children)

Did the same apply when calculators came out? Or the Internet?

[–] [email protected] 5 points 2 weeks ago

Yes! Preach!

[–] [email protected] 5 points 2 weeks ago

don't worry, you can become president instead

[–] [email protected] 5 points 2 weeks ago (1 children)

I literally just can't wrap my AuDHD brain around professional formatting. I'll probably use AI to take the paper I wrote while ignoring archaic and pointless rules about formatting and force it into APA or whatever. Feels fine to me, but I'm not going to have it write the actual paper or anything.

[–] [email protected] 4 points 2 weeks ago (1 children)

Cries in "The Doctor" from Voyager.

[–] [email protected] 4 points 2 weeks ago (2 children)

The Doctor would absolutely agree. He was intended to be a short-term assistant when a doctor wasn't available, and he was personally affronted when he discovered that he wouldn't be replaced by a human in any reasonable amount of time.

[–] [email protected] 4 points 2 weeks ago

If we are talking about critical thinking, then I would argue that students using AI is a natural response to the very obvious shift most instructors have taken: using AI as much as possible to plan lessons, grade, and verify sources (you know, the job they're being paid to do, which, by the way, was already being outsourced to whatever tools they had at their disposal; no offense, TAs).

I feel it still shows the ability to adapt to a forever changing landscape.

Isn't that what the hundred-thousand dollar piece of paper tells potential employers?
