Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyDZsMKF… — "Dude this conversation makes no sense. Chatgpt generates answers that flow from …"
- ytc_Ugy2qq0mb… — "I don't get it, is the video an hypothetical of technology in the future or aski…"
- ytc_UgwtJ6DuW… — "nah that ai pic was juat as awesome. I fucking hate people who sniff their own a…"
- ytc_Ugx48htnM… — "I met Ameca at the museum of the future in Nürnberg Germany! German premiere in…"
- ytc_UgwYKGA-0… — "The difference is that digital art is a medium that still requires some degree o…"
- ytc_UgzJkiskP… — "There is just nothing inspiring about it. There is no idea and no heart. An arti…"
- ytc_UgxzAnSEL… — "What would be the purpose of the AI seeming nervous about the questions being as…"
- ytc_Ugw6IF5KE… — "if ai reaches that point creatives still win because you need someone creative t…"
Comment
Obviously the answer to all this is 42.
The problem is we don't know how to properly instruct it. Too few directives sends it off in its own direction. Too many puts it in a conflict. That is where we need to be focusing our study on not worrying about an AI becoming evil or malevolent.
youtube · AI Governance · 2026-03-30T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxNngpzvlmoRsOWrnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwLMkER-lkYjrrp7yF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxNG4cAC3mORElpNj54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwED9KricIAGGO1qDp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugzn4WCcvlKv4jpjab54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwmEyWsObVRtGMM0bx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzlccIz6vK7aFrNf3J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxjdl0NlaaLlMwkfV14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzIVvK1exYgvem-Vpt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6PQjYL3RWe12Qex94AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
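The raw response above is a JSON array of per-comment records, so "look up by comment ID" reduces to parsing the array and indexing it by the `id` field. The sketch below is a minimal, hypothetical illustration (the function name `lookup_coding` and the variable `raw_response` are not from the tool itself); the two records and their field values are copied verbatim from the response shown above.

```python
import json

# Raw batch response as returned by the model (truncated here to two
# records copied from the full response above).
raw_response = """[
  {"id": "ytc_UgzlccIz6vK7aFrNf3J4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzIVvK1exYgvem-Vpt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coded record for one comment ID, or None if absent."""
    records = json.loads(raw)
    # Build an id -> record index so repeated lookups are O(1).
    index = {r["id"]: r for r in records}
    return index.get(comment_id)

record = lookup_coding(raw_response, "ytc_UgzlccIz6vK7aFrNf3J4AaABAg")
print(record["policy"], record["emotion"])  # regulate approval
```

The same parsed record supplies every row of the Coding Result table (responsibility, reasoning, policy, emotion); only the "Coded at" timestamp would come from the tool's own bookkeeping rather than the model output.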