this post was submitted on 04 Jul 2025
SneerClub
Hurling ordure at the TREACLES, especially those closely related to LessWrong.
AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)
This is sneer club, not debate club. Unless it's amusing debate.
[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
This, and similar writing I've seen, seems to make a fundamental mistake in treating time as if only the next few decades exist: that any objective taking longer than that is impossible and not even worth trying, and that any problem emerging after a longer period can be ignored.
It has a valid point: anyone trying to convince you that they can bring about a society of the very far future within your own lifetime, if only you do as they say or give them money, is either scamming you or has fooled themselves out of desperation to see the end results without the work being put in. That's understandable, given that the work required for such a society to exist will, barring an unlikely miracle of technology, take far longer than anyone now alive will live. Perhaps some of this tendency comes from speculating about the possibilities of the future without giving as much thought to the practicalities of getting there.
It is also, I think, generally correct that trying to build a more sustainable society is of high importance.
However, the kind of future vision it and similar things I've read seem to want people to imagine instead doesn't really make much sense past the next few decades, or maybe centuries. Sustainability is important, sure, but it should be noted that, thermodynamics being what it is, it is also not truly possible forever. You cannot design a civilization that persists beyond a certain point without outside input, so if you are thinking about what paths to take in the future, and value people and societies, it is eventually imperative to acquire resources from outside. This won't bring you utopia forever, but it should bring you more of it, for longer.

It also isn't really fair to say that things like space development cannot happen simply because they are very hard and haven't happened yet, despite science fiction sometimes depicting them in the near future. There simply has not been enough time. This kind of science fiction has existed for what, decades? Maybe a century, depending on what you count? Just developing the "easiest" (relatively speaking) parts of the solar system is a task of centuries; getting anything meaningful even barely beyond the solar system, one of millennia; and actually controlling the galaxy, assuming it truly does turn out to be empty for some descendant of ours to control, one of millions of years, simply on account of the scale involved. It is far too premature to say these things cannot happen just because they haven't happened within decades.

It is also a bit absurd to claim they won't happen because they simply cannot be done. It is very hard to build a society in a place as desolate as, well, anywhere off Earth, and we don't yet have the know-how to do it, but we know that a system capable of sustaining life and civilization over the timescales needed to move through space can exist, because Earth is already an example of such a system.
Given that Earth isn't even engineered, it is highly unlikely to be the smallest or simplest possible example of such a system, either. We already have a society drifting in space; replicating and expanding it breaks no physical laws, or else it couldn't have happened in the first place and I wouldn't be writing this.
There is a certain irony to everyone involved in this argument, if it can be called that. Those who like to think about what could be achieved in the very far future tend, in my experience (I have a strong interest in such things myself and hang out in spaces for such discussion), to hold an extremely overoptimistic notion of the timeline, steps, and work involved, and so seem to think it will all happen tomorrow, figuratively speaking, without much need to contribute to the actual process of achieving it (or else they hold a grandiose and often counterproductive notion of what contributing looks like). Meanwhile, those who suspect it is all impossible dreaming that distracts from the immediate problems facing society probably make these things more likely to happen: partly because, to build the society of the far future, society has to continue to exist and function from now until then, and partly because knowledge and practical experience in maintaining a climate, keeping industrial activity from destroying it, and not using up nonrenewable resources in a few decades is exactly the kind of expertise any society built beyond Earth would someday need. The work of building the future is, for the average person, just keeping society working long enough for incremental improvements to stack up, and that simply isn't very exciting, even if it is the logical prerequisite for much that is.
It's doubly ironic that many of the people who think that, if everyone just listened to them, they could do it all quickly also consider themselves long-term thinkers for doing so. Truly long-term planning requires patience and flexibility beyond what any one person is probably capable of at present. At the same time, this doesn't mean we shouldn't try to imagine a much greater future than what we have, or put in the work for it, provided we can be realistic about what progress we can expect to watch happen and what that work really is.
don’t do this debatefan here crap here, thanks
this isn’t the article you’re thinking of. this article is about Silicon Valley technofascists making promises rooted in Golden Age science fiction as a manipulation tactic. at no point does the article state that, uh, long-term objectives aren’t worth trying because they’d take a long time??? and you had to ignore a lot of the text of the article, including a brief exploration of the techno-optimists and their fascist ties (and contrasting cases where futurism specifically isn’t fascist-adjacent), to come to the wrong conclusion about what the article’s about.
unless you think the debunked physics and unrealistic crap in Golden Age science fiction will come true if only we wish long and hard enough in which case, aw, precious, this article is about you!
That wasn't a debate; I just find it easier to process my thoughts about something by writing about it in a semi-formal way (not because I expect people to actually read whatever I ramble on about, but because it's easier for me to go back to it myself if it looks less informal). I'm not trying to debate anyone here, wherever "here" is (I don't remember if I've seen this community on my feed before; I don't generally pay much attention to which community something is in, and now that I've looked, I'm a bit confused about what this one is). I don't have a lot of self-control about replying to things and find it very hard not to get into arguments when I find something I disagree with, but I don't actually enjoy arguing and don't have the skills for formalized "debate" anyway.
I'm not saying the article directly stated these things; they're the subtext I got out of reading it, mostly from the later parts of the text, or from other things I've read that felt like they carried the same sort of message (because again, I'm not trying to prove anything to anyone with all this). I did read the earlier parts too, but I guess they seemed obvious enough that they didn't stick in my head as much as the later bits. Of course fascists lie; that's pretty much the only thing their ideology runs on, and it's less interesting to me than the bits at the end where they mention, for example, Kim Stanley Robinson. I know Golden Age sci-fi runs on magic, but I also think things of similar scale and potential are eventually possible within the already known physics of the real world.
fair enough!
it’s ok, nobody does. that’s why we ban it unless it’s amusing (which effectively bans debate for everyone unless they know their audience well enough to not fuck up) — shitty debatelords take up a lot of thread space and mental energy and give essentially nothing back.
SneerClub is a fairly old community if you count in its Reddit origins; part of what we do here is sneering at technofascists and other adherents to the TESCREAL belief package, though SneerClub itself tends to focus on the LessWrong Rationalists. that’s the context we tend to apply to articles like the OP.
Hmm, you probably wouldn't (or shouldn't) like me particularly much, then. I've not heard that term before, but I do very strongly consider myself a transhumanist at the very least. I suppose I won't try to explain why, if this is a place that both dislikes that stuff and bans arguing. Feel free to laugh at my hubris, I guess, if that's what you mean by sneering? Though I do get the part about Silicon Valley types: if the technology to do what I would actually want to do with myself came out, but it was developed and sold by, like, Elon Musk or such, I'm not touching it with the world's longest pole.
Re: being a transhumanist: well, that is why I personally don't like the term TESCREAL, as I do think there can be good transhumanists out there. The hot-swappable-furry-sex-organ transhumanists vs. the I-just-want-to-code-JavaScript-faster transhumanists. (Meant as a strawman of both, but I'd prefer the first, especially as they're more likely to go the 'it should be available to all of us or none of us' route.)
Personally, I don't think I would mind living forever (or until I'm tired of it), given that everyone gets the choice, we stay healthy, and there's free stuff, for example. But that will not happen under the Muskians/Rationalists/etc. (I'm remembering the moment I discovered one of the H+ people was spouting anti-trans lines, for example; they just want to personally live forever and fuck everyone else, it seems.)
E: ah, they got told to leave.
also fair enough. you might still enjoy a scroll through our back archive of threads if you’ve got time for it — there is a historical context to transhumanism that people like Musk exploit to further their own goals, and that’s definitely something to be aware of, especially as TESCREAL elements gain overt political power. there are positive versions of transhumanism and the article calls one of them out — the Culture is effectively a model for socialist transhumanism — but one must be familiar with the historical baggage of the philosophy or risk giving cover to people currently looking to cause harm under transhumanism’s name.
I don't have a whole lot of time tonight as it happens, though I'll look through them for a little while to see if I find what you're talking about. I'm not sure if you know of the YouTube channel Philosophy Tube, or whether they're liked or disliked here if so, but they did a fairly critical video about the topic once that I seem to remember touching on some nasty characters involved in its beginnings, if that's the kind of thing you mean. I do understand there is a lot of potential harm in the idea, whether implemented badly, abused by someone interested in the power it implies, or, in the nearer term, used as a hypothetical promise to gain support for, like, eugenics or such. But I think there is enough potential good, if done well, that attempting to do it well is worth a (probably careful) try.
no problem! I don’t mean to give you homework, just threads to read that might be of interest.
yeah, a few of us are Philosophy Tube fans, and I remember they’ve done a couple of good videos about parts of TESCREAL — their Effective Altruism and AI videos specifically come to mind.
if you’re familiar with Behind the Bastards, they’ve done a few videos I can recommend dissecting TESCREAL topics too:
I've actually spent the last hour and a half or so looking back at the community a bit, especially its top posts (I really ought to sleep, tbh, and hence really shouldn't be doing this, but I have a trip tomorrow and so I'm not really able to fall asleep anyway, oh well). I will say that it's something of a bizarre mix to me of very familiar and unfamiliar. After getting more of an idea of what this place is, I really don't think I'd be terribly welcome here, as I think I fit maybe half of your acronym, more or less to a T, but strongly disagree with the other half of it (which, confusingly to me, a bunch of comments on the top posts seem to suggest people think isn't possible if one accepts the others? Maybe I'm reading into hyperbole or sarcasm there; I'm not so good with those things). You could probably find one or two things in my comment history to point and laugh at, though I don't recommend looking there, simply on account of the rambling way I talk when these things come up and all the other stuff you'd have to look through to find it.
I also think I may have encountered not this community but another on the same instance (the TechTakes one) once before, and was similarly confused about it, not really getting what it was even after having it described, because I was unfamiliar with the Reddit version or with what "LessWrong" was (though I am familiar with Roko's Basilisk, which sounds like it's from there? I think it's one of the most laughably stupid ideas I've come across in AI speculation discussions, but I have heard of it). Looking through your community, I get a bizarre sense of whiplash at what these people apparently believe, in that I feel like I agree with a bunch of their initial premises (including a lot of the weirder ones I suspect you all dislike, like utilitarianism and some of the counter-intuitive things it can lead to at large scales) and then suddenly find they come to a conclusion wildly different from, or even the inverse of, the one I reach. A sort of "no, no, that's not what that implies, how the hell can you do futurism that wrong!" vibe. It might be something mundane like political values; a lot of these guys seem like right-wing types, and I tend towards a variety of socialism.
Something I did find slightly interesting is that I haven't heard of practically any of the people who recurringly come up in the top posts, like this Yud guy (except for the rich and widely reported-on ones), despite it seeming like at least some of my ideas would have led me to cross paths with someone who follows them. At the same time, I was somewhat surprised that the person whose community I discuss these topics in most didn't seem to come up (Isaac Arthur; I even searched his name out of curiosity but got no results, on the Lemmy at least; I didn't try the Reddit, as I don't like going back to that platform). Maybe he's less laughable to you because he presents things more as "these are the sorts of crazy-sounding things a really advanced civilization might do" rather than a call to attempt them right this moment, available tech notwithstanding, or maybe because he doesn't bring up his (admittedly, to me, disagreeable) politics much? Regardless, there are at least some people I've met on his Discord who sound a lot like the guys the top posts talk about, lol, like one who is convinced ChatGPT is somehow already sentient and also already smarter than any human.
I also have a bit of a conundrum with this: I don't want to block the instance, as I do think criticism of ideas I sometimes hold, even if it's just pointing out how weird they sound to most people, is valuable, and the criticism of ideas I don't hold can at least serve as examples of lines of reasoning to avoid, or at least take with a grain of salt. It's also kind of entertaining. On the other hand, I know myself well enough to know that I have very little self-control, and that if I see something I disagree with, especially on a topic I care about (and some portion of what comes up here fits that), I have a hard time not ending up arguing with people about it. I don't want to do that; I waste way too much time on it and get majorly stressed out doing it, and it sounds like it's against the community rules here anyway. But I also get anxious leaving a thought unsaid, and usually end up spending two hours trying to find every analogy and way of explaining that I can to preclude misunderstanding, which, obviously, fails, because at that point I have a rambling wall of text again. I also use Lemmy fairly casually on the /all feed (it feels too small to spend time in just one or a few communities), and always forget to check where something is before jumping into a discussion, so knowing myself, I'll probably end up breaking that rule unintentionally at some point. I don't suppose anyone knows of a way to, like, mark an instance so as to get a visual highlight or warning before posting a response there ("look where you're posting, dummy"), or to toggle off the ability to comment there, or something? Probably not, I'm guessing, but with FOSS stuff you never know what kind of feature someone might have made.
Alternately, I guess I could ask for an instance ban or something, if that doesn't make the instance un-viewable from my account (I have no idea whether it does), but that might be a bit far just to deal with my own lack of internet discipline, I dunno.
hey no problem, we’ve got systems in place for this kind of thing. happy trails.
(though for the record, re the idea that right-wing posters are allowed in here without being told to go fuck themselves: lol)