Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgwmESAAa…`: @frances4797 Uhhh do you think self-flying helicopters and planes exist? I'm sure…
- `rdc_jmfrf4k`: I agree and we are probably on the same side of this issue, and the issue of rep…
- `ytc_UgzzChT1Z…`: AI and robots should only be used in risky and more complex and very repetitive …
- `ytr_UgyEY2coI…`: @TheCupAnimations Yes, we did. I can't imagine average people hating on it. Only…
- `ytr_UgwB2A1T8…`: Honestly, anyone who tries to argue that being against AI is ableist, makes me t…
- `ytc_UgwbhZOkw…`: There should be a law that all AI content has a caption at the beginning, statin…
- `ytc_UgzoJjEtU…`: can a vacuum be installed in the back of the head and connect to the throat?…
- `ytc_UgwUei_wn…`: I like it as a tool for human use, but fear it as a tool of human misuse. Elon M…
Comment
Honestly, it's sort of a clickbait video. You'll hear a lot of scaremongering while certain details are omitted, such as why exactly each person quit, or whether the AI systems were specifically told to break rules when they did. Plus, the script itself is AI-generated... notice how many statements in the video are structured as "It's not just this, it's this." I think it might've gotten some details wrong, too.
Sure, I am still concerned about the dangers of AI - specifically what people *themselves* can and will do with them, deliberately or accidentally.
youtube · AI Governance · 2026-03-17T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
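The table above is presumably rendered from a single coded record plus a stored coding timestamp. A minimal sketch of that rendering step, assuming the field names from the raw batch below and treating the function name and the separately stored `coded_at` value as illustrative:

```python
# Hypothetical renderer for the "Coding Result" markdown table. The dict keys
# match the fields in the raw LLM response batch; "coded_at" is assumed to be
# stored alongside the record rather than inside it.
def render_coding_result(rec, coded_at):
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

record = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "industry_self", "emotion": "mixed"}
print(render_coding_result(record, "2026-04-27T06:24:59.937377"))
```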
Raw LLM Response
```json
[
  {"id":"ytr_UgwWyW3vQg2x9xIptB54AaABAg.AURk1-1DAAIAUSjwE5WdQX","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwzGmtADaJEY6k8qmd4AaABAg.AURi2sb4-VbAUTdup9Exgj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwzGmtADaJEY6k8qmd4AaABAg.AURi2sb4-VbAUZ_GgE0DjN","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx9UBQQH62TXoM3hoV4AaABAg.AURhjNEiVbOAURn4xYUs4U","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytr_UgzaPv4Qlg_zxb4Bwdt4AaABAg.AURhQgzcup7AUSijQTkShd","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxWkOCjBPsxNPNil3p4AaABAg.AUReWdyptusAUS1cAPNH29","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxWkOCjBPsxNPNil3p4AaABAg.AUReWdyptusAUS3WKb34Wk","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx0ioAUFHKUh2EkHf14AaABAg.AURdsXd4PoVAUV7_fjOuWH","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"sadness"},
  {"id":"ytr_UgyEYKV9ah0Y7WJ3eiZ4AaABAg.AURdVyQ5hJRAURg5zNhFoK","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgxJtANl7XWDUrYvchp4AaABAg.AURc-4KMPnUAURi0J1hB_2","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
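A downstream consumer has to parse and sanity-check a batch like the one above before storing it. A minimal sketch, assuming the model returns a JSON array of records and using only the per-dimension values that appear in this batch (the full code book is not shown here, so the schema below is an assumption):

```python
import json

# Allowed values per dimension -- assembled from the values that actually
# appear in this batch; the real code book may allow more (assumption).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "liability", "industry_self", "none", "ban"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "approval", "sadness"},
}

def validate_batch(raw):
    """Parse the model output and split records into valid / rejected."""
    valid, rejected = [], []
    for rec in json.loads(raw):
        bad = [dim for dim, allowed in SCHEMA.items() if rec.get(dim) not in allowed]
        if bad or "id" not in rec:
            rejected.append((rec.get("id"), bad))
        else:
            valid.append(rec)
    return valid, rejected

# Two illustrative records: one clean (the coding shown above for the
# displayed comment) and one with an out-of-schema value.
raw = json.dumps([
    {"id": "ytr_Ugx9UBQQH62TXoM3hoV4AaABAg.AURhjNEiVbOAURn4xYUs4U",
     "responsibility": "company", "reasoning": "consequentialist",
     "policy": "industry_self", "emotion": "mixed"},
    {"id": "ytc_example", "responsibility": "government",
     "reasoning": "unclear", "policy": "none", "emotion": "fear"},
])
valid, rejected = validate_batch(raw)
print(len(valid), len(rejected))  # 1 1
```

Rejecting rather than silently coercing out-of-schema values makes coding errors visible, which matters if the batch feeds aggregate statistics.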