Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Oof this is why I'm actually probably going to either quit art or move art from …" (ytc_UgyWw_t_u…)
- "Claude told me it didn't want me to start a new chat session because it didn't l…" (ytc_Ugw7Mo0sg…)
- "What do you expect? The *entire point* of self-driving cars is that you don't ha…" (rdc_e14h8bs)
- "True. Out of the jobs that you guys created because your jobs are pathetic you …" (ytc_Ugy05IOjH…)
- "I am polite and patient with AI because I selfishly do not want to develop the h…" (ytc_Ugxq1VDgg…)
- "😮 wow! I am curious to know who this robot belongs to, & who programed the rob…" (ytc_Ugy41CT93…)
- "Calling it art is an insult because has intention, emotion and a message. Even i…" (ytc_UgytsL_P2…)
- "Soon I will make a robot animal by putting a brain of a dead person in a robot…" (ytc_Uggl45nMs…)
Comment

> What both individuals AND corporations are missing is the tech billionaires at the very top of ai aren’t interested in employees OR consumers. They don’t need either. We are completely superfluous to requirements for them.

Source: youtube · Topic: AI Governance · Posted: 2026-04-23T06:0… · ♥ 22
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugzv_sUeIkIpB6R3Wid4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxWwQtWApFtH99yUVl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiokLHVRIQkGY4EhF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9ij4biNZBHlbZpMR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEKci8xKKmTyClAIt4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3UIrDAGck8aWNV3p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzXQASHevScr9RXtZR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxQc0f_fVvNnPleqfd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxX3OtmoYg2y3jDy8R4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzjv8DJzLmOLKNEMyJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}]
```