I have spent the last half-hour in the angry dome
“Raw, intellectual horsepower” means fucking an intellectual horse without a condom.
Oh, wait, that’s rawdogging intellectual horsepower, my mistake.
So, the Wikipedia article about “prompt engineering” is pretty terrible. First source: OpenAI. Second: a blog. Third: OpenAI. Fourth: OpenAI’s blog. ArXiv, arXiv, arXiv… 43 times. Hop on over to the Talk page, and we find this gem:
It is sometimes necessary to make assumptions to write an article (see WP:MNA).
Spoiler alert: that link doesn’t justify anything. It basically advises against going off on tangents: there’s no need to re-litigate the fact of evolution on every damn biology page. It does not say that Wikipedia should have an article on some creationist fantasy, like baraminology or flood geology, based entirely on creationist screeds that all cite each other.
Underlying original post: a Twitter bluecheck says,
Sometimes in the process of writing a good enough prompt for ChatGPT, I end up solving my own problem, without even needing to submit it.
Matt Novak on Bluesky screenshots this and comments,
AI folks have now discovered “thinking”
No worries
If you can’t get through two short paragraphs without equating Stalinism and “social justice”, you may be a cockwomble.
Welp, time to start the thread with fresh Awful for everyone to regret:
r/phenotypes
Here’s a start:
Given their enormous environmental cost and their foundation upon exploited labor, justifying the use of Large Generative AI Models in telecommunications is an uphill climb. Since their output is, in the technical sense of the term, bullshit, the climb has no merit.
Man, now I’m bummed that I don’t have a cult trying to distribute translations of my Daria fic in which Jane becomes Hell Priest of the Cenobites.
I think it could be very valuable to alignment-pill these people.
Zoom and enhance!
alignment-pill
The inability to hear what their own words sound like is terminal. At this stage, we can only provide palliative care, i.e., shoving them into lockers.
[Fiction] [Comic] Effective Altruism and Rationality meet at a Secular Solstice afterparty
When the very first thing you say about a character is that they “have money in crypto”, you may already be doing it wrong
“The Publisher of the Journal ‘Nature’ Is Emailing Authors of Scientific Papers, Offering to Sell Them AI Summaries of Their Own Work”, by Maggie Harrison Dupré at Futurism:
Springer Nature, the stalwart publisher of scientific journals including the prestigious Nature as well as the nearly 200-year-old magazine Scientific American, is approaching the authors of papers in its journals with AI-generated “Media Kits” to summarize and promote their research.
In an email to journal authors obtained by Futurism, Springer told the scientists that its AI tool will “maximize the impact” of their research, saying the $49 package will return “high-quality” outputs for marketing and communication purposes. The publisher’s sell for the package hinges on the argument that boiling down complex, jargon-laden research into digestible soundbites for press releases and social media copy can be difficult and time-consuming — making it, Springer asserts, a task worth automating.
Internally at Meta:
- trans and nonbinary themes stripped from Messenger
- enforcement policy now allows for the denial of trans people’s existence
- tampons removed from men’s restrooms
- DEI programs shuttered
- Kaplan briefed top conservative influencers the night before policy changes were announced
My favorite quote from flipping through LessWrong to find something passingly entertaining:
You only multiply the SAT z-score by 0.8 if you’re selecting people on high SAT score and estimating the IQ of that subpopulation, making a correction for regressional Goodhart. Rationalists are more likely selected for high g which causes both SAT and IQ
(From the comments for “The average rationalist IQ is about 122”.)
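For anyone who wants to know what the 0.8 is actually doing there: it’s garden-variety regression to the mean. If SAT and IQ are modeled as bivariate normal with correlation ρ, the expected IQ z-score given an observed SAT z-score z is ρz, so picking people for sky-high SAT scores and converting naively overestimates their IQ. A minimal sketch of that arithmetic, where ρ = 0.8 is the commenter’s figure and the SAT mean/SD are illustrative round numbers, not anything from the post:

```python
# Regression-to-the-mean correction the comment invokes: under a
# bivariate-normal model with correlation RHO between SAT and IQ,
# E[IQ z-score | SAT z-score = z] = RHO * z.

RHO = 0.8  # the comment's assumed SAT-IQ correlation, not an established fact

def expected_iq(sat, sat_mean=1060, sat_sd=210, iq_mean=100, iq_sd=15):
    """Expected IQ of a group selected on this SAT score, shrunk by RHO."""
    z = (sat - sat_mean) / sat_sd      # observed SAT z-score
    return iq_mean + iq_sd * RHO * z   # shrink toward the mean, then rescale

# Naive conversion calls SAT 1500 an IQ of about 131;
# the shrinkage-corrected estimate is closer to 125.
naive = 100 + 15 * (1500 - 1060) / 210
print(f"naive: {naive:.1f}, corrected: {expected_iq(1500):.1f}")
```

Whether self-reported rationalists are “selected for high g” rather than high SAT scores is, of course, exactly the kind of premise that makes the whole comment section what it is.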
Saying that Excel is not and never was a good solution for any problem feels like a rather blinkered, programmer-brained take.
Ah, here’s the post I was thinking of; I missed it somehow.
shot:
chaser: