Raw LLM Responses

Inspect the exact model output for any coded comment. You can look a comment up by its ID, or pick one of the random samples below.

Random samples:
- "You’re in the middle of having a conversation with your bestie and suddenly they…" (rdc_njhj339)
- "I’m scared of robot assassination because I’m a minority and I know the United S…" (ytc_Ugxl4myIh…)
- "Hmmm the first clip — look at her mouth. It’s not even forming words, just openi…" (ytc_Ugw2O1aEa…)
- "In a society based on consumerism, people are just a function of a factory. A f…" (ytr_Ugz33CxCd…)
- "Did anyone see the title was A man asked AI for health advice and every brain c…" (ytc_UgzznIqFd…)
- "It's not that the AI doesn't _have_ examples of actual internet comments, but th…" (rdc_l9xkni4)
- "and you are the same group of idiots who did go after those cars because you got…" (ytr_UgwR8G35p…)
- "Bruh, I can't draw, but I'm still against AI art. These people have practised th…" (ytc_UgxFh_gPm…)
Comment

> Wow. If anything will make an AI get pissed or and want to kill humans or itself it’ll be waiting for us mouth breathers to sign off on every decision.
>
> Even as a human sending a decision up the chain of command for upper management to not get it and decline is infuriating. For an AI it’ll be like us sending a complex mathematical equation to a shit flinging monkey to ask for sign off.

Source: reddit · Topic: AI Responsibility · Timestamp: 1606068710.0 (Unix epoch) · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response

```json
[
  {"id":"rdc_gd8ozkq","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"rdc_gd8qags","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_gd8qf1t","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"rdc_gd8r086","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"rdc_gd8ymev","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
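The model returns one JSON array per batch, with each element coding a single comment across the four dimensions shown above. A minimal parsing sketch, assuming only the structure visible in the sample response (the field names come from the response itself; the helper name `parse_codes` and the decision to silently drop malformed rows are my own):

```python
import json

# The four coded dimensions, as they appear in the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Map comment ID -> coded dimensions, dropping malformed rows."""
    out = {}
    for row in json.loads(raw):
        # Skip anything that is not a complete coding row: models
        # occasionally emit partial or misshapen objects in a batch.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(dim in row for dim in DIMENSIONS):
            out[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return out

# Two rows taken verbatim from the sample response above.
RAW = """[
  {"id":"rdc_gd8ozkq","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"rdc_gd8qf1t","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

codes = parse_codes(RAW)
print(codes["rdc_gd8ozkq"]["emotion"])  # outrage
```

Indexing the parsed rows by ID is what makes the "look up by comment ID" view above cheap: once a batch is parsed, inspecting any coded comment is a single dictionary lookup.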