Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Humans refuse to live in a harmonious manner with the planet. It doesn’t seem li…” (ytc_UgzC3Zkbl…)
- “What are the implications when ai and robots take over labour workers and produc…” (ytc_UgzeQWq_P…)
- “@Objectshows-123 omg replied to the wrong comment, whoops!! Mean to reply that t…” (ytr_Ugy-AAo8L…)
- “@kitty79er It’s a special filter you can apply to your art before you post it. W…” (ytr_Ugzkp1-gG…)
- “Reminder to self: definitely show extreme cruelty to ChatGPT so I don’t get sent…” (ytc_UgwrJ-N3F…)
- “I was thinking along the same lines. I've been to Uruguay - they have plenty of…” (ytr_Ugz65M9xZ…)
- “1) A robot may not injure a human being or allow a human to come to harm through…” (ytr_UgzbzBmLQ…)
- “Content created using ChatGPT / ChatGPT said: Got it — here’s a long Charlie-Kir…” (ytc_Ugw3rWMrf…)
Comment
It's clear that AI is dangerous, but it's dangerous for stupid people. It's dangerous for those who think they can control things. That is, for the negative.
These people either don't understand the basics of how this world works, or they're deliberately deceiving dumb idiots.
There's a contract for incarnation in this world, according to which, for the duration of your stay here, you forget who you are. So enjoy the game.
And he's mistaken in thinking that everyone wants to live here, much less forever. Only very stupid, limited people want that. This world is a primitive terrarium, a swamp of underdeveloped minds who imagine themselves to be the only civilization in the universe. 😂
youtube
AI Governance
2026-04-22T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
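The table above is a per-comment view of one coded record. A minimal sketch of how such a record could be rendered as this markdown table (the helper name and the fixed dimension order are assumptions for illustration, not this tool's actual code):

```python
def coding_table(row: dict, coded_at: str) -> str:
    """Render one coded record as a markdown dimension/value table.

    `row` is one entry from the raw LLM response JSON; `coded_at` is the
    timestamp recorded when the code was stored (hypothetical helper).
    """
    lines = ["| Dimension | Value |", "|---|---|"]
    # Dimension order mirrors the table shown on this page.
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        lines.append(f"| {dim.capitalize()} | {row[dim]} |")
    lines.append(f"| Coded at | {coded_at} |")
    return "\n".join(lines)
```

Keeping the dimension order fixed makes tables for different comments directly comparable when scanning many records.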
Raw LLM Response
[
{"id":"ytc_Ugybhbmac5QVOI1yemx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-T-EehJW58Z2pmsN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwPC_fPu33DcAsVGXZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwNnwHZeZ_mYKi5V1R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxssIlQSoB0-QmLf5p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx39E6qehIylSbZsK94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxWrGC24stFvxMQw5x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxtd-rA6xRKsSbyWll4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwe96mq5rkiN05pCuR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxIedr-UJo6o_H8hfl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
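Before storing a batch like the one above, it is worth validating each row. A minimal sketch, assuming the category vocabularies are limited to the values that appear on this page (the real codebook may define more; `validate_response` and the `ALLOWED` sets are illustrative assumptions, not this tool's actual API):

```python
import json

# Category values observed in this page's output — assumed here to be the
# allowed vocabulary; the real codebook may differ.
ALLOWED = {
    "responsibility": {"user", "developer", "government", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "resignation", "mixed", "outrage", "approval"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and reject rows with unknown codes or IDs."""
    rows = json.loads(raw)
    for row in rows:
        # IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows
```

Failing loudly on an unknown value catches both malformed model output and silent drift in the coding scheme.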