Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "this is already happening and kinda scary when you think about it. people used t…" (`rdc_oh2d17v`)
- "When things go south with AI . Humans will unite and start destroying those supe…" (`ytc_UgxdJMDlh…`)
- "Why do I have a weird feeling the movie i-robot will play out in real life? As t…" (`ytc_Ugy9l-9jb…`)
- "yes chat GPT-5 already released and people pridict AGI will come out in 2026, so…" (`ytr_UgxI9lNHo…`)
- "The same things were said about the industrial revolution. Employment exploded. …" (`ytc_UgwJr1YGH…`)
- "AI has convinced me that the human soul exists, because it has shown us what art…" (`ytc_UgxJZoqEb…`)
- "I’m not talking about those chump change offers. I’m talking about Zuckerberg of…" (`rdc_mz05l4u`)
- "Once ai starts to self improve and redesign itself to ensure self improvement it…" (`ytc_Ugx_ZVo9C…`)
Comment

> Where is the proof that self driving cars will kill fewer people than human drivers? Everyone keeps stating that but there's no evidence for it at all. I'm sick of the way tech people state hallucinations as reality. No wonder their products do the same thing.

youtube · AI Governance · 2025-12-30T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgyJFVVelDJhmLDxFh94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxmp-0x-b8UATC3pip4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxptfr3QJqB7I70Sd94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy01oxlhz_GwERYiNV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7I7PMt7BuhYjzbqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxz3ZjS2k3Oa0_c0A94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyDPJNtVs5vhenhuKJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOsLWV6wJxu7BXqWF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzUGGWkVtBuoXaTRGF4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzhB3vD6kGYBqNIMkV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
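The raw response above is a JSON array of per-comment codes, one object per comment with the dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a batch could be parsed and a single record looked up by comment ID (one record is reproduced here for brevity; the variable names are illustrative, not part of the tool):

```python
import json

# One record from the raw batch response shown above.
raw = ('[{"id":"ytc_UgzOsLWV6wJxu7BXqWF4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"none","emotion":"outrage"}]')

# Index the coded records by comment ID so any single comment can be inspected.
codes = {row["id"]: row for row in json.loads(raw)}

record = codes["ytc_UgzOsLWV6wJxu7BXqWF4AaABAg"]
print(record["reasoning"], record["emotion"])  # deontological outrage
```

Because every record carries its own `id`, the lookup dictionary supports the same "look up by comment ID" flow as the panel above, regardless of the order in which the model returned the records.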