chaos

joined 2 years ago
[–] chaos@beehaw.org 2 points 1 week ago (1 children)

Even if the right move was "give up and do what the Republicans want," they still did a terrible job. House Democrats held the line and stuck their necks out, only to get blindsided, and Schumer shouldn't have signaled that there'd be a fight right before he caved. The left hand doesn't know what the right hand is doing, and only a handful of people in the party seem to even be trying to do anything.

[–] chaos@beehaw.org 10 points 1 week ago (1 children)

Yeah, they're probably talking about nulls. In Java, object references (simplified pointers, really) can be null, pointing nowhere. That's fine when you genuinely don't have a value for a reference (say, someone asked for a thing that doesn't exist, or you haven't created the thing yet), but it means that any time you interact with an object, if the reference turns out to be null, a NullPointerException gets thrown and likely crashes your program. You can check first if you think a value might be null, but if you miss one, it explodes.

Kotlin has nulls too, but the type system helps track where they could be. If a variable can be null, it'll have a type like String?, and if not, the type is String. With that distinction, a function can explicitly say "I need a non-null value here" and if your value could be null, the type system will make you check first before you can use it.
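A minimal sketch of the difference (function and variable names are just illustrative): code that would compile in Java and explode at runtime won't even compile in Kotlin until the null case is handled.

```kotlin
fun shout(s: String) = s.uppercase()   // type says: non-null String only

fun main() {
    val maybe: String? = null          // String? admits null
    // shout(maybe)                    // compile error: String? is not String
    if (maybe != null) {
        println(shout(maybe))          // smart cast: maybe is a plain String here
    }
    println(maybe?.length)             // safe call: yields null instead of throwing
}
```

The `?.` safe-call operator and the compiler's smart casts are what make those checks cheap enough to actually do everywhere.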

Kotlin also has some nice quality-of-life improvements over Java: it's less verbose (not a hard task), it doesn't force everything to belong to a class, and it supports data classes, which (when declared with val properties) are immutable and compare by value rather than by reference, behaving more like primitive values than typical objects.
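For example (a made-up Point class, just to illustrate): a data class with val properties gets value equality, a readable toString(), and non-mutating updates via copy(), all generated by the compiler.

```kotlin
data class Point(val x: Int, val y: Int)

fun main() {
    val a = Point(1, 2)
    val b = Point(1, 2)
    println(a == b)        // true: compared by value, like a primitive
    val c = a.copy(y = 5)  // "modify" by creating a new value instead of mutating
    println(c)             // Point(x=1, y=5), from the generated toString()
}
```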

[–] chaos@beehaw.org 3 points 3 weeks ago (1 children)

I want my tax dollars to be used for something useful, not buying special little numbers

[–] chaos@beehaw.org 7 points 3 weeks ago

The only objection I'd raise is that first episodes are usually pilots, where they haven't sorted everything out yet and have to spend the whole episode setting up the premise of the show rather than actually doing the show. Episode 2 is a lot more likely to be representative of everything that comes after it. But other than that, yeah, of course, just walk away, hit da bricks, real winners quit.

[–] chaos@beehaw.org 4 points 3 weeks ago

I'm still using an iPhone mini and I haven't experienced any bad layouts, broken websites, or any difficulty like that. It has the same resolution as the biggest iPhone I've ever had (an iPhone X), so things render smaller, which would make it a bad fit for someone with poor vision, but for me it's an absolutely perfect phone. It's frustrating to know that the perfect phone for me could easily exist, yet Apple refuses to make it. I'll be stuck with phones I don't like for the rest of my life, it seems.

[–] chaos@beehaw.org 1 points 3 weeks ago

I empathize with that frustration. The process of thinking you're right, learning you're wrong, and figuring out why is fundamentally what coding is. You're taking an idea in one form (the thing you want to happen, in your mind) and encoding it into another, very different form: a series of instructions to be executed by a computer. Your first try is almost always slightly wrong. Humans aren't naturally well-adapted to this task, because we're optimized for instructing other humans, who will usually do what they think you mean rather than exactly what you said, can gloss over or correct small mistakes or inconsistencies, and will act in their own self-interest when it makes sense. A computer won't behave that way; it requires you to bend completely to how it works. It probably makes me a weirdo, but I actually like that process. It's a puzzle-solving game for me, even when it's frustrating.

I do think asking an AI for help with something is a useful way to use it; it really isn't all that different from checking a forum (in fact, those forums are probably what it's drawing from in the first place), and hallucinations aren't too damaging, because you'll check the AI's answer when you try what it suggests and see whether it works. What I think is harmful is blindly accepting the code it produces (and it sounds like you aren't doing that). In an IDE it's really easy to quickly make pages of code without engaging the brain, and it works well enough to be very tempting, but not, as I'm sure you know, well enough to do the whole thing.

[–] chaos@beehaw.org 1 points 3 weeks ago (2 children)

Yeah, totally fair. I'll note that you're kind of describing the typical software development process: a customer talks with the developer and develops requirements collaboratively, the developer comes back with a demo, the customer refines it with "oh, that won't work, it needs to do it this way" or "that reminds me, it also needs to do this", and so on. But you're closer to playing the role of the customer in this scenario, acting more like an editor or manager on the development side. The organizers of a game jam could make a reasonable argument that doing it this way is akin to signing up for the jam, coming up with an idea, then having a friend who isn't signed up implement it for you, when the point is to do it all in person, quickly, in a fun and energetic environment. The people doing a game jam like coding; that's the fun part for them, so someone signing up and skipping all that stuff does have a bit of a "why are you even here, then" aspect to it. Of course, it depends on the degree to which the AI is being used and how much editorial control or tweaking you're doing. It's a legitimate debate, and I don't think you're wrong to want to participate.

[–] chaos@beehaw.org 3 points 4 weeks ago (4 children)

I'll acknowledge that there's definitely an element of "well I had to do it the hard way, you should too" at work with some people, and I don't want to make that argument. Code is also not nearly as bad as something like image generation, where it's literally just typing a thing and getting a not-very-good image back that's ready to go; I'm sure if you're making playable games, you're putting in more work than that because it's just not possible to type some words and get a game out of it. You'll have to use your brain to get it right. And if you're happy with the results you get and the work you're doing, I'm definitely not going to tell you you're doing it wrong.

(If you're trying to make a career of software engineering or have a desire to understand it at a deeper level, I'd argue that relying heavily on AI might be more of a hindrance to those goals than you know, but if those aren't your goals, who cares? Have fun with it.)

What I'm talking about is a bigger-picture thing than you and your games; it's the industry as a whole. Much like algorithmic timelines turned the internet from something you actively explored into something you passively let wash over you, I'm worried that AI is creating a "do the thinking for me" button that's too tempting for people to use responsibly, and that too much code will become half-baked AI slop cobbled together by people who don't understand what they're really doing. There's already enough cargo-culting around software, and if overused and over-relied on, AI will just make it more opaque and mysterious. But that's the big picture; just like I'm not above lying back and letting TikTok wash over me sometimes, I'm glad you're doing things you like with the assistance you get. I just don't want that to become the only way things happen, either.

[–] chaos@beehaw.org 7 points 4 weeks ago (6 children)

> The irony is that most programmers were just googling and getting answers from stackoverflow, now they don't even need to Google.

That's the thing, though: doing that still requires you to read the answer, understand it, and apply it to the thing you're doing, because the answer probably isn't tailored to your exact task. Doing that work is how you develop an understanding of what's going on in your language, your libraries, and your own code. An experienced developer has built up those mental muscles and can probably get away with letting an AI do the tedious stuff, but more novice developers who let the AI handle the easy things are depriving themselves of learning what they're actually doing, and they'll be helpless when they hit the things the AI can't do.

Going from assembly to C does put the programmer at some distance from the reality of the computer, and I'd argue that if you haven't at least dipped into some assembly and understood the basics of what's actually going on down there, your computer science education is incomplete. But once you have that understanding, it's okay to let the computer handle the tedium for you and only dip down to that level when necessary. The same goes for learning sorting algorithms versus just calling your standard library's sort() function. AI falls into that category too, I'd argue, but it's so attractive that I worry it's treating important learning as tedium and helping people skip it.
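To make that sorting comparison concrete, here's a quick sketch (Kotlin, purely illustrative): the hand-rolled version you might write once to understand the mechanics, next to the library call you'd reach for afterwards.

```kotlin
// Hand-rolled insertion sort: worth writing once to see how it actually works.
fun insertionSort(a: IntArray) {
    for (i in 1 until a.size) {
        val key = a[i]
        var j = i - 1
        while (j >= 0 && a[j] > key) {  // shift larger elements right
            a[j + 1] = a[j]
            j--
        }
        a[j + 1] = key
    }
}

fun main() {
    val xs = intArrayOf(5, 2, 9, 1)
    insertionSort(xs)
    println(xs.joinToString())                // 1, 2, 5, 9
    println(intArrayOf(5, 2, 9, 1).sorted())  // same result from the standard library
}
```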

I'm all for making programming simpler, for lowering barriers and increasing accessibility, but there's a risk there too. Wheelchairs are obviously good things, but using one simply because it's easier, and not because you need to, will cause your legs to atrophy, or never develop strength in the first place, and I worry something similar is going on with AI in programming. "I don't want to have to think about this" isn't a healthy attitude to have; a program is basically a collection of crystallized thoughts and ideas, and thinking it through is a critical part of the process.

[–] chaos@beehaw.org 18 points 1 month ago

The entire country shifted red. They would've had to implement this system in all 50 states, even ones that didn't matter, and across 50 different voting systems, many of which are entirely paper-based, and not leave a single scrap of evidence. Individual ballots are secret, but lots of other records are not, including who did and did not vote in each precinct, and how many ballots were cast for each candidate, so if they were just injecting lots of fake ballots, the numbers wouldn't add up. The simple fact is, the 2020 election wasn't stolen, and neither was the 2024 one.

[–] chaos@beehaw.org 8 points 2 months ago

I see this as an accessibility problem: computers have incredible power, but taking advantage of it requires a very specific way of thinking and the drive to push through adversity (the computer constantly, and correctly, telling you "you're doing it wrong") that a lot of people can't or don't want to muster. I don't think they're wrong or lazy to feel that way; it's a barrier to entry, just like a set of stairs is to a wheelchair user.

The question is what to do about it, and there's so much we as an industry should be doing before we even start thinking about getting "normies" writing code or automating their phones. Using a computer sucks ass in so many ways for regular people: you buy something cheap and it's slow as hell, it's crapped up with adware and spyware out of the box, scammers are everywhere ready to cheat you out of your money... anyone here is likely immune to all that, or knows how to navigate it, but most people are just muddling by.

If we got past all that, I think it'd be a question of meeting users where they are. I have a car, but I couldn't replace the brakes, nor do I want to learn how, and that's okay. My car is as accessible as I want it to be, and for the parts that aren't, I go another route (bring it to a mechanic who can do the things I can't). We can do this with computers too: make things easy for regular people, but don't try to make them all master programmers or tell them they aren't "really" using the computer unless they're coding. Bring the barrier down as low as it can go, but don't expect everyone to be jumping over it all the time, because they likely care about other things more.

[–] chaos@beehaw.org 4 points 2 months ago

I'm so confused that the same people can say "why does everyone get their undies in a bunch about us happily accepting arbitrary data in columns regardless of type? That's good, it's flexible... but fine, we'll add a STRICT keyword if you really want column types to mean something" and also "every other SQL database says 1 == '1', but this is madness, strings aren't integers, what is everyone else thinking?!"
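For anyone who hasn't run into it, a small sketch of SQLite's two attitudes side by side (assuming the org.xerial sqlite-jdbc driver with a bundled SQLite of 3.37 or newer, which added STRICT; the table names are made up):

```kotlin
import java.sql.DriverManager

fun main() {
    DriverManager.getConnection("jdbc:sqlite::memory:").use { conn ->
        val st = conn.createStatement()

        // Flexible typing: a TEXT value lands in an INTEGER column without complaint.
        st.execute("CREATE TABLE loose (n INTEGER)")
        st.execute("INSERT INTO loose VALUES ('not a number')")

        // With STRICT (SQLite 3.37+), the same insert is rejected.
        st.execute("CREATE TABLE tight (n INTEGER) STRICT")
        runCatching { st.execute("INSERT INTO tight VALUES ('not a number')") }
            .onFailure { println("STRICT rejected it: ${it.message}") }

        // Yet comparisons refuse to coerce: 1 = '1' is false (0) in SQLite,
        // where many other databases coerce the string and say true.
        val rs = st.executeQuery("SELECT 1 = '1'")
        rs.next()
        println("1 = '1' evaluates to ${rs.getInt(1)}")  // prints 0
    }
}
```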
