Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment (truncated) | ID |
|---|---|
| @theyoungturks I have never disagreed with you all so many times in one week. Th… | ytc_Ugx9bcUTI… |
| Mr Robot here is quite scary. He is honestly telling his intentions. Just sca… | ytc_Ugw1ZPkoe… |
| @tomykong2915 yeah, helpful ai that stole from artists, how do you think ai can … | ytr_Ugz4pLXGJ… |
| Just because we're able to develop AI doesn't mean we should do it immediately. … | ytc_UgzpZj0Xg… |
| I think out of all the arguments AI bros say, the "its more accessible" is the o… | ytc_Ugxxvusfb… |
| Did he just answer "I don't know" to the question, "Does AI have consciousness? … | ytc_UgzpqbRBH… |
| Same MO... the blame for what happens next is projected on the solution they've … | ytc_UgyyyIRBq… |
| So nice to see the woke idiots face the music. Àn AI writing movies and TV would… | ytc_UgziJDSIr… |
Comment
The 3 laws of robotics 1. a robot must never harm a human. 2. a robot must never let harm come to a human. 3. A robot must do its best to protect itself unless it contradicts the first 2 laws. -I. Asimov. I ROBOT

youtube · AI Harm Incident · 2025-11-20T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxmU8XHi7U6HBdNzjR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwuJXQ-SQPnpD-3xWV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyRW3NITWdL6hw64yF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzh8Fk3whU-4SZdH094AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzCqUr_0yyG1Cu-E0N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwUtPHHoCnVVAZVg3N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzHBNzLJTCyAYTw1IN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwoYSSZt_cw1mRIX0J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzAMEm4WRd6l4krN8J4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw3Sw7lM87MnMjMVvl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
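A raw batch response like the one above still has to be parsed and checked before its rows reach the coding table. The sketch below shows one minimal way to do that in Python, assuming the category vocabularies are exactly the values visible in this sample (the real codebook may define more); the function name and the ID-prefix check are illustrative, not part of the actual pipeline.

```python
import json

# Category vocabularies observed in the sample response above.
# Assumption: the real codebook may allow additional values.
ALLOWED = {
    "responsibility": {"company", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows.

    A row is kept when its id carries a known prefix (ytc_ for top-level
    comments, ytr_ for replies, as seen in the samples) and every
    dimension holds an in-vocabulary value.
    """
    valid = []
    for row in json.loads(raw):
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows that fail validation are dropped rather than repaired, so a single hallucinated label does not silently enter the coded dataset.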