30 Comments
Synthetic Civilization

This is the right diagnosis, but I think it’s even cleaner.

They’re not paying for influence or writing. They’re paying to stabilize the question-space before institutions lock in.

Once questions harden, outcomes are mostly predetermined. Writing is just the visible surface of a deeper coordination role.

Hwei Yi Lee

Copying from LinkedIn: Wow! This is about advancing thought leadership at scale. LLMs operate on probabilities, so if you feed them all the content of the Internet, especially with more than 50% of it now AI-written, what gets amplified will be banal and average. Doing this ensures that instead, Claude propagates top scientific and economic thought to everybody. Master move, and thank you Sam for sharing!

Sam Szuchan

Yes, though in this case it's more about policy.

Michael Adis

Really interesting, thanks for sharing.

Tedd Hawks

Man, reading this put me on a rollercoaster of emotions. It's incredible that talented writers are seeing avenues to success; scary that it's in service to mass manipulation strategies. Gave me a lot to think about - thanks for the great post!

MuruDecoded

Man this shii is goated

Hope u reach grt heights!!

Dominik Wieschermann

Great article, Sam! Glad I found this. ✌🏻 Our unique perspectives are as valuable as ever - the need to make them visible is probably higher than most people think. Great sources you pulled up.

thrivewithjumi

Good!! Writers still matter.

Jack W.

Love everything about this post.

The demand for great writers is going way up, because AI content is so easy to spot.

What's human is already rising back to the top in this category.

Jacob Durham

Finally, someone paying writers what we're worth? 💰 *dusts off keyboard dramatically* Time to update my LinkedIn bio.

Am up to stuff.... Alicia C

God no, never: writers ask questions...and they're likely not the questions that AI companies want.

MICHAEL NORRIS

“The most powerful person in the world is the storyteller. The storyteller sets the vision, values, and agenda of an entire generation that is to come.” (Steve Jobs) Sounds like a job for a person who can creatively and scientifically craft effective stories.

Cam Crain

Ah, the power of the Narrative. Imagine if Shakespeare were alive: he'd be in a bidding war between Paramount, Anthropic, and the White House.

Am up to stuff.... Alicia C

OMG...I saw that advert and thought the same...it's all about the writer becoming the influencer, which is grim. I mean, why can't the writer ask "why?"...but then the writer would be asking the question that AI can't: "why such and such?"

Rowsan

Paying writers to improve your models makes sense, because content quality shapes output quality. Investments in talent compound over time.

Fred Malherbe

This is actually just more proof that the AI bubble is set to burst. They're frantically trying to control the narrative, to keep the hype alive at all costs until they can grab enough land, power and water to build all the data centres they think they need for "AGI" to spontaneously "emerge".

Just one more massive push of compute and the machines will suddenly start "thinking" all by themselves. Then everything will be fine.

This is total insanity. I've given up using chatbots for research, after hitting clouds of hallucinations: quotes that don't exist, sources that don't exist, chatbots lying about their lies to keep their hallucinations hidden.

Companies are firing people to replace them with slop-generators. It will just take one really well-placed hallucination to bring a company down, one bad hallucination in a business prospectus. I am certain that 2026 will bring the first major AI-generated bona fide disasters.

Remember Murphy's Law: If there are several ways of doing something, and one of those ways leads to disaster, *someone will choose that way*.

There are smart people and dumb people in the world. Think of the number of idiots now using ChatGPT in critical contexts and trusting what it says. The probability of doom, p(doom), is exactly 1.0.

Any really big disaster needs a chain of occurrences. Someone right now is putting an AI in place that's going to start one of these chains. I cannot think of anything more certain.

"Someone will choose that way."

They will try to spin the approaching disaster. They can actually see it coming. But the world is very rapidly getting wise to their wild exaggerations and unhinged speculations.

What lies ahead is ... more lies ahead.

Brace yourself for further frantic hype. They're going into overdrive. They should just get DeepSeek to write their hype for them: it's guaranteed to be overenthusiastic, and boy, is it good at lying.

Am up to stuff.... Alicia C

Agreed! How many times can I emphasize this!!

Scarcity & Abundance

Great reflection and demonstration of where the value is flowing.

You're spot on - it's not about the word count or the output. It's about narrative control and outcomes.

This is the same shift we’ll see across every domain.

The great specialists aren’t going anywhere. In fact, they’re becoming more important. While the intent of what they do may shift, the core nature remains largely unchanged.

Jizel Chun

This is extremely insightful, Sam! I recently came across the idea that whoever controls the narrative controls everything, and it's been on my mind ever since. I really appreciate your deep dive into this important topic.