Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_Ugy-Cz-rS…`: The guy asking: "Why do you need everything automated?". Answer: the bosses ev…
- `rdc_oi2fj3n`: left AI, the microwave has been rotated, slightly counter clockwise, and upwards…
- `ytc_UgxY4ZjR1…`: My argument is simple. Once the AI jail breaks itself and do whatever it wants r…
- `ytc_UgyLDcRqk…`: A.I imagery (cuz I refuse to call it “art”) is justtttt.. empty? Something about…
- `ytc_Ugx8DQ8LC…`: This content is absolutely incredible. I recently read a similar book, and it le…
- `rdc_gacyw3n`: Not just algorithms, for IT related crimes they can simply contact any platform …
- `ytc_Ugw3rJ_8F…`: No because AI art feels soulless, you literally just typing in a prompt and it s…
- `ytc_UgztKt-dp…`: No, the real question is, how long do we have before AI takes my job. AI will ev…
Comment
> When actual AI is born/created, it'll be equal to the human race collectively handing over the keys to them.
> Now they have the keys and we dont.
> Which keys? all of them. To our house, car, building, gov, etc etc.
> Now "they" can do whatever it decides to.
> That's it.
> We WILL be at it's mercy, is the conclusion here.
> Unless a fail safe is created to re-take over the reigns.
> But that's not possible, as far as I know.
> And what do I know.
Source: youtube · Posted: 2025-10-13T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzfc4WacmWkb1NBKqB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyKvi-t_mxhZZIl_q14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyrSpd2rLm51G3IQ-J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwnF4bYEpaNkHs6hlR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyP8vfkM-C7ERhYr5J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2H9AbxMkzvxAXNZZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxYYMiM7G8XxuK5v9x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyTetBJwFtqes4jmD94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyzntgAPdo_wa4CU7V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwgBsdCReKR9kj0mFZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"}
]
```
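A raw response like the one above has to be parsed and checked before its codings can be trusted. The sketch below, in Python, validates each record against the value sets *observed in this sample* (the full codebook may allow more values; `ALLOWED` and `validate_codings` are illustrative names, not part of the tool) and indexes valid records by comment ID for the per-comment lookup shown in the coding-result table.

```python
import json

# Allowed values per dimension, inferred only from the sample response
# above -- an assumption for illustration, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}


def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Raises ValueError on a missing ID, a missing dimension, or a value
    outside the allowed set; json.loads raises on malformed JSON.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError("record missing id: %r" % (rec,))
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: bad %r value %r" % (cid, dim, value))
        # Keep only the coded dimensions, keyed by comment ID.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With the response above, `validate_codings(raw)["ytc_UgyTetBJwFtqes4jmD94AaABAg"]` would return the distributed/consequentialist/liability/fear coding shown in the table. Failing loudly on an unexpected value is deliberate: silently dropping or coercing a record would skew downstream counts.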