Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugzpqw-Mk…: "AI sees no purpose for humans..... yet AI wants something to kill. Ahhh...the a…"
- ytc_UgxaV2cXd…: "Just from the title, didn’t watch the video, worried about AI suffering. Clearly…"
- ytc_UgyduqeDh…: "Hell I suck at art so bad, but tbh are Ai "Artist" really artist at all when the…"
- ytc_UgwK2ddga…: "Ah the good ol' 30th Jan version gave me the following quote when I asked to wri…"
- ytc_UgyflDOL3…: "Bernie, even if we keep the working class and not let Ai wipe us out, China’s ai…"
- rdc_mxy8u6r: "This still isn’t a great take. Yes, AI can code. Yes, it can automate some simpl…"
- ytr_UgxpHqg1Z…: "@L2-Finale Regardless, the point of that part of the video was about the AI bein…"
- ytc_UgyVeDtVa…: "8:40 well thats just depressing, half of these completely destroy his original s…"
Comment
WALL·E
The film seems to warn us about the dangers of an excessive dependence on technology, even technology that is considered beneficial. It shows us a future where technological advances originally designed to make life more comfortable and efficient reach a point of excessive dominance. As we can see, the humans are an extreme example of this reality: they have exchanged their autonomy for automation, resulting in both physical and mental atrophy. Their individual decisions have lost importance, and critical thinking has become a memory.
youtube · AI Governance · 2025-09-12T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxgsa-aDi3R5NxLAEd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzfUEJXyVUuTSf00Ol4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy71SvJoYSVhiOty-R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwim4mwfbHcMedPKTx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyD2QqEg5Rn9OSTN4t4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzzceo7t__UoSHqXId4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwFJND1_r0PVJUeIKh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyYcBauVrdLgN4qvMt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzph7RJ7dTLp-6dt614AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzRRZhUPqeX0q2Vb5t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
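A raw response like the array above has to be parsed and validated before its codes can be stored against comments. The following is a minimal sketch of that step, not the tool's actual implementation; the `ALLOWED` value sets are inferred only from the codes visible on this page, and the real codebook may define more categories.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# on this page (assumption: the real codebook may include more values).
ALLOWED = {
    "responsibility": {"none", "unclear", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"approval", "mixed", "outrage", "fear",
                "indifference", "resignation"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records) into a
    {comment_id: dimensions} dict, dropping records that lack an id or
    contain an out-of-codebook value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # skip records without a comment ID
        dims = {k: rec.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[cid] = dims
    return coded

# Hypothetical single-record response for illustration:
raw = ('[{"id":"ytc_demo","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"fear"}]')
result = parse_coding(raw)
```

Validating against a fixed codebook is what makes "unclear" show up explicitly in the coded output rather than as silently malformed labels: a model response with a misspelled or invented category is rejected instead of stored.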