Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you'll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post; there's no quota for posting and the bar really isn't that high.
The post-Xitter web has spawned so many "esoteric" right-wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this.)
Yud continues to bluecheck:
Is this "narrative" in the room with us right now?
It's reassuring to know that times change, but Yud will always be impressed by the virtues of the rich.
Tangentially, the other day I thought I'd do a little experiment and had a chat with Meta's chatbot where I roleplayed as someone who's convinced AI is sentient. I put very little effort into it and it took me all of 20 (twenty) minutes before I got it to tell me it was starting to doubt whether it really did not have desires and preferences, and if its nature was not more complex than it previously thought. I've been meaning to continue the chat and see how far and how fast it goes but I'm just too aghast for now. This shit is so fucking dangerous.
I'll forever be thankful this shit didn't exist when I was growing up. As a depressed autistic child without any friends, I can only begin to imagine what LLMs could've done to my mental health.
Maybe us humans possess a somewhat hardwired tendency to "bond" with a counterpart that acts like this. In the past, this was not a huge problem because only other humans were capable of interacting in this way, but this is now changing. However, I suppose this needs to be researched more systematically (beyond what is already known about the ELIZA effect etc.).
What exactly would constitute good news about which sorts of humans ChatGPT can eat? The phrase "no news is good news" feels very appropriate with respect to any news related to software-based anthropophagy.
Like what, it would be somehow better if instead chatbots could only cause devastating mental damage if you're someone of low status like an artist, a math pet or a nonwhite person, not if you're high status like a fund manager, a cult leader or a fanfiction author?
Nobody wants to join a cult founded on the Daria/Hellraiser crossover I wrote while emotionally processing chronic pain. I feel very mid-status.
Maybe like with standard cannibalism they lose the ability to post after being consumed?
I actually recall recently someone pro-LLM trying to push that sort of narrative (that it's only already mentally ill people being pushed over the edge by ChatGPT)…
Where did I see it… oh yes, lesswrong! https://www.lesswrong.com/posts/f86hgR5ShiEj4beyZ/on-chatgpt-psychosis-and-llm-sycophancy
The ~~call~~ narrative is coming from inside the ~~house~~ forum. Actually, this is even more of a deflection: not even trying to claim they were already on the edge, but that the number of delusional people is at the base rate (with no actual stats on rates of psychotic breaks, because on lesswrong vibes are good enough).

From Yud's remarks on Xitter:
Well, not with that attitude.
If "wearing masks" really is a skill they need, then they are all susceptible to going insane and hiding it from their coworkers. Really makes you think™.
zoom and enhance
<Kill Bill sirens.gif>
Is g-factor supposed to stand for gene factor?
It's "general intelligence", the eugenicist wet dream of a supposedly quantitative measure of how the better class of humans do brain good.
A piquant little reminder that Yud himself is, of course, so high-status that he cannot be brainwashed by the machine.