Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- In ten years, there will be another round of frenzied hiring to clean up messes … (rdc_m7cgz6o)
- I’ve interacted with a lot of AIs but have yet to see any intelligence. Most of … (ytc_UgzH39WGT…)
- @simonfernandes6809question is who can fucken afford it ? Sure you maybe have tr… (ytr_UgyYQoIdb…)
- Ai art can be harder then actual art at times cuz you have to have such a specif… (ytc_UgzfPNihm…)
- Name one species that humans don't treat as either a pet, slave, or resource. O… (ytc_UgyPYtYsY…)
- When I told one of my teachers that I’m gonna make short films for my horror gam… (ytc_Ugx0qEQvi…)
- Why the fuck are we not talking about the fact that a fucking AI ruined this guy… (ytc_Ugw35dkn0…)
- 9:00 FUUUUUUUUUCK, a computer's lack of desire (as in 'it's just a machine it do… (ytc_UgyCYeW-0…)
Comment
Altman says all this with a straight face, not an instant of self-reflection about betraying your entire species for temporary profit (if even that).
Grok didn't lose its mind: it embodied Elon Musk's intrusive thoughts.
The genocidal AI is not unique either: at least one earlier instance (in Russia) had to be turned off when it went full racist fascist.
We need to get the Butlerian Jihan going now.
Source: youtube · Video: AI Moral Status · Posted: 2026-01-05T02:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxCNVU2LVdhAI-Q47l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzotAOIzdKEoZUuOdB4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwG_g4OaHosRuYrkn14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyvgFEzQIA24i1kv8Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzLoKr8NltkMWlCcvZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxshuuslFJsXdjKwQB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugydu0gRDKoHyEw2qMN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyuz9aq7T940d_UDVh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzJlbNa4OYRf1qsQFV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxlIE7kwx3qPRr9G_14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
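Responses like the one above are only usable if every record carries the four coding dimensions with a recognized label. A minimal validation sketch, assuming the allowed values are exactly those observed in the table and response here (the real codebook may permit more labels than these):

```python
import json

# Coding dimensions and labels observed in this page's output.
# ASSUMPTION: the actual codebook may define additional labels.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"virtue", "contractualist", "consequentialist",
                  "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed",
                "fear", "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against SCHEMA.

    Raises ValueError on a missing id or an unrecognized label, so bad
    batches fail loudly instead of silently entering the dataset.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Hypothetical example record, shaped like the response above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"virtue","policy":"regulate","emotion":"outrage"}]')
print(len(validate_coding(raw)))  # 1
```

Keeping the check on the raw string (rather than on already-parsed data) means a malformed JSON reply from the model is caught at the same choke point as a bad label.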