Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The extremely extremely, how you say,..." Casually dressed"( puting it mildly) h…
ytc_UgzYiIVSZ…
oscar should win the oscar for acting like AI art has a soul
oh yeah, the fact …
ytc_Ugxujm_Hb…
Very upsetting video. We should not afraid that we don't understand, e.g. comput…
ytc_UgzJN32Bf…
Was gonna say: "Trains are safer. Why we talking about self-driving cars being s…
ytc_Ugy17g62s…
Sorry but AI told that people like you will be fooooked by 2035-2040. All this A…
ytc_Ugxdqw-ZV…
It does scan for that. I saw someone talking about this online so grain of salt:…
rdc_ohzs3zz
It seems to me that in the preview you needed to depict the AI of the "artist", …
ytc_UgyVpkFH9…
***VERY IMPORTANT***
THIS IS A FICTIONAL VIDEO
If you go to the link at the sta…
ytc_UgyvbRX5H…
Comment
If robots had feeling or thoughts it would be the end of the world. They would either turn on us or accidentally try something they are curious about. Giving a robot curiosity is like building a wooden fireplace and lighting it.
Source: youtube
Video: AI Moral Status
Posted: 2021-04-27T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzY9MQI6DX_CCOFXb94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzKJIEuhxY-PXo3Kkx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugxhqe0SG50DCpK3xo94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwG83mfjMxmSl1KXVR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzgCs0eAsqI8ZYDNHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwuyJDKroN6U71Mni54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxfdAbhMkl7VF7nId14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx25490C9O0istXlBx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzNlya-f1wMwjftKcp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw1GgJxNzYvlH5bWJR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
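To make the lookup-by-comment-ID view concrete, here is a minimal sketch of how a raw batch response like the one above could be parsed and indexed. The `index_batch` helper and the per-dimension vocabularies are assumptions inferred only from the values visible in this sample; the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension (assumed from the sample records
# above; the actual codebook may be larger).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference"},
}

def index_batch(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    dropping any record with an out-of-vocabulary value."""
    records = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if cid and all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            records[cid] = rec
    return records
```

With the response indexed this way, showing a "Coding Result" panel for a given comment is a single dictionary lookup on its ID, and malformed or off-vocabulary records are excluded before display.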