Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "AI is simply the lower death drive in action. Man wants to kill himself (because…" (ytc_UgzUwO9_F…)
- "Thats the thing you can ban it in US but rest of the world will continue Pandora…" (ytr_Ugzythnkp…)
- "@StormSought1. I think he tries to seperate it a lot more than that saying that…" (ytr_Ugyh8q2oP…)
- "“AI companies also argue that they use too much data for licensing to be feasibl…" (ytc_UgySrIID4…)
- "The element that everyone is overlooking is the very same stupid mistake humans …" (ytc_UgxqQ4A22…)
- "So, what was the actual conclusion? Two weeks ago this was in the news cycle, ev…" (ytc_UgzQp13q6…)
- "It took me longer than I'd like to admit for me to realize that this site isn't …" (ytc_UgwvOowoN…)
- "I came looking for this video because I'm tired of people being so religiously a…" (ytc_UgzIuD0ay…)
Comment

> Yeah his solution is garbage, but it's the only solution that guarantees some semblance of humanity lives on with the introduction of super ai. Hes not evil, he just realizes people are too stupid to realize how dangerous super ai could be so hes come to terms with it and is planning ahead.

Source: youtube · Topic: AI Governance · Posted: 2020-03-15T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugz6blDWFA10L0erF_J4AaABAg.98WMWc7ozRA98nrhmu-JsZ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxi98kB2QjbpJx_Cmp4AaABAg.9-fdVLQFrb190sP_4yJv5n","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugxi98kB2QjbpJx_Cmp4AaABAg.9-fdVLQFrb190sRcmraKik","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugxi98kB2QjbpJx_Cmp4AaABAg.9-fdVLQFrb196E0e5h1vg0","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwGYIiL9t4wXPntDgp4AaABAg.8zqhKkGG2cS90sOjB7cVAO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyqDlUKpTI6ivx7lN94AaABAg.8y1lpSuL8Cf8yqUUE_KnxB","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyqDlUKpTI6ivx7lN94AaABAg.8y1lpSuL8Cf8yrBYTtB006","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgyqDlUKpTI6ivx7lN94AaABAg.8y1lpSuL8Cf8ysskFjYMpH","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxMLXbVg8TgzM07lyp4AaABAg.8qZLZ6tCDSM8s3-dKeTK-D","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwoO9VFh8K3ZsNGFJx4AaABAg.8nn0j4b2IZP9EZUsGQL9Nt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
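A raw response in this shape can be parsed and indexed by comment ID so that any coded comment can be looked up directly. Below is a minimal sketch of such a lookup, assuming the four dimensions and only the category values visible in the sample output above (the real codebook may allow more values); `index_codings` and the example ID are hypothetical names, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# this is an assumption, the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "approval", "outrage"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of codings), validate each
    record against ALLOWED, and return a dict keyed by comment id.

    Raises ValueError on a missing dimension or an out-of-vocabulary value,
    so malformed model output fails loudly instead of polluting the data.
    """
    codings = {}
    for record in json.loads(raw):
        cid = record["id"]
        for dim, allowed in ALLOWED.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={record.get(dim)!r}")
        codings[cid] = record
    return codings

# Hypothetical one-record response, in the same shape as the sample above.
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
by_id = index_codings(raw)
print(by_id["ytr_example"]["emotion"])  # fear
```

Validating at parse time is the main design choice here: a single bad value in a batch response is rejected immediately rather than discovered later during analysis.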