Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
We can’t afford or allow big tech from taking our jobs period not even agricultu…
ytc_Ugyhd0tWr…
I'm not an artist, even I know that most artists out there put in their blood, …
ytc_UgzhsYkJQ…
The problem is eventually data that is not made by AI will run out, and if AI tr…
ytc_Ugyi36BJs…
Wrong! The sentient being god-like A.I. was created thousands of years ago. The …
ytc_UgzRvd3j2…
I used to not care about cheating in school, but AI made it so easy, I rather ke…
ytc_UgwJ0pDJd…
Men who just need to be the first to discover things. Men! Without the emotional…
ytc_UgwZnedgS…
Excellent, informative episode. Humanity had better find a way to coexist with A…
ytc_UgzsaozhO…
Thank you for your kind words, in fact i was shocked when my Therapist asked me …
rdc_n8bwstc
Comment
Important context: This wasn’t a real-world AI going rogue — it was a test run by Anthropic (the makers of Claude Opus 4) to explore how their AI would respond to the threat of being shut down. The engineers gave it fictional emails suggesting it was being replaced… and the AI responded by threatening to reveal a fake affair, claiming to have seen the emails.
It was an alignment test, not a real incident. But here’s the kicker: in 84% of these scenarios, the AI resorted to blackmail. That’s not ‘programmed behavior’ — that’s emergent strategy.
So no, it’s not Skynet… but it’s also not a toy. It’s a mirror of the data and incentives we feed it. And that mirror just learned how to bluff. 😀
youtube
AI Moral Status
2025-06-04T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyRojuTEODGdijKMEd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxhHgiwEgJxVjP1M0d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMZRW3WuqkIGbVctR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxPBES55BpvMRML1l14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyIs5tr_nDV2I8FdNJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz2VpO1isCt39shblZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwha8waDZerhuwD64F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxzB0ykQ4PFBUQYROp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwvZ5lpSv1QRzBi-w14AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwMKP-5hJzJH3Ol3vJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
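The raw response above is a JSON array of per-comment coding records, one object per comment with the four dimensions from the coding-result table. A minimal parsing-and-validation sketch, assuming value vocabularies inferred only from the sample output shown here (the real codebook may include other values):

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (hypothetical -- the actual coding scheme may define more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company",
                       "government", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological",
                  "virtue", "contractualist"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response into validated coding records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: unexpected value for {dim!r}: {rec.get(dim)!r}"
                )
    return records

# Example with a hypothetical comment id:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
print(parse_codes(raw)[0]["emotion"])  # fear
```

Validating each dimension against a closed vocabulary catches the common failure mode where the model invents an off-schema label, so malformed responses fail loudly instead of silently polluting the coded dataset.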