Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Its silly. Imagine if a theology prof started expounding on ai? Thats how he sou…" (ytr_UgyTsBk3P…)
- "clearly the first clip is AI. Gerard never answers his phone for applicants. the…" (ytc_UgwA_LMyf…)
- "on highways I trust AI driving more than I trust a human driving. it is funny ho…" (ytc_UgxDfGbcZ…)
- "This says more about how bad and unclear your roads are. If more road markings w…" (ytc_Ugyc_Z4mb…)
- "My biggest fear is the fact that people won’t want to commission me anymore if t…" (ytc_UgzOBgp3V…)
- "not really, ai art is boring because you cant get that feeling of seeing a drawi…" (ytr_Ugzz9P2TL…)
- "Just did some research and it seems like the best low-tech way to disarm a human…" (ytc_UgzceTtb6…)
- "This reminds me of the mcplant or some vegetarian version of a chicken nugget. L…" (ytc_UgwGYlMB1…)
Comment
AI will end up destroying just about everything including most of common humanity is not that AI, not because AI is inherently dangerous, it is because the corporations that will have control over it are EVIL beyond rational imagination and the governmental insitions that are supposed to protect us from this evil did not in the past, do not now, and have no intention of doing so in the future. Earnestly presented words and silly charters do not protect anyone from anything, death sentences sometimes do. I give this World 20 years at the most.
youtube · AI Governance · 2023-06-15T02:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx7A1WJhXcBxK0H-Hl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzx1ZltpGKD-qhgsKB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz-v6oTgnPe0dLfju94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzt_BP38llCmOZwCqJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz1RgzcY0ReESwe4HZ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwPj-Bw17_ZInSgi1p4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxZ7bcXM0YC8c047T14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxooo6914TaV6rnLG54AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyZGOVLRa1Jeuvip1h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzrLtnrjdlYIArVdH14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
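The raw response above is a JSON array, one object per comment, with the same four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response might be parsed and indexed by comment ID; `parse_codes` and `DIMENSIONS` are hypothetical names for illustration, not part of this tool, and the two sample rows are copied from the response above:

```python
import json

# Two rows copied verbatim from the raw LLM response above.
RAW = '''[
  {"id": "ytc_Ugx7A1WJhXcBxK0H-Hl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzt_BP38llCmOZwCqJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# The four coding dimensions used throughout this page.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Index coded rows by comment ID, keeping only the expected dimensions."""
    rows = json.loads(raw)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

codes = parse_codes(RAW)
print(codes["ytc_Ugzt_BP38llCmOZwCqJ4AaABAg"]["policy"])  # prints: liability
```

A lookup-by-ID view like the one at the top of this page amounts to exactly this indexing step: load the stored raw response, key it by comment ID, and display the row.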