Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
When I think of something autonomous, my thoughts become very rigid and Elon’s actions don’t predict good outcomes. If profits are getting maximized then dropping anything that produces less than average could be a path forward, don’t be unproductive—the unproductive get eliminated lethally
Source: youtube
Topic: AI Governance
Posted: 2025-06-21T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzEkKp7cQExTz4ahJp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwtUEsaBQpjMzxHqct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDb1ghPHSVMrYQN8h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxh4WS9jv0h5txEXhd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxbFiHDR_fiAfd5FHR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxNOZOdbQO58gx-BiV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7XWzxioMkkkseryp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzKpIlqznyPNt390wV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCdXA9NjQcwmt8YwV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwMu1vPlPFuKpIHV154AaABAg","responsibility":"government","reasoning":"unclear","policy":"ban","emotion":"approval"}
]
```
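The raw response above is a JSON array with one coding record per comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and matched back to a single comment (the `lookup` helper and the abbreviated two-record batch are illustrative assumptions, not the tool's actual code):

```python
import json

# Abbreviated batch response: two of the records shown above.
raw_response = """
[
 {"id": "ytc_UgzCdXA9NjQcwmt8YwV4AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_UgwMu1vPlPFuKpIHV154AaABAg", "responsibility": "government",
  "reasoning": "unclear", "policy": "ban", "emotion": "approval"}
]
"""

def lookup(raw: str, comment_id: str) -> dict:
    """Parse the batch JSON and return the coding record for one comment ID."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return by_id[comment_id]

coding = lookup(raw_response, "ytc_UgzCdXA9NjQcwmt8YwV4AaABAg")
print(coding["emotion"])  # fear
```

This is how the per-comment Coding Result table above is derived: the record whose `id` matches the inspected comment supplies the Responsibility, Reasoning, Policy, and Emotion values.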