Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- "And one day the AI came across this video and got the idea that humans were an i…" (ytc_Ugz8g3czY…)
- "Can people stop making comments like this? As long as there are still dumb peopl…" (ytr_Ugz9dfrxc…)
- "... or HAL in 2001 - or the smart bomb in Dark Star... you can actually talk to …" (ytr_UgxUuMTpe…)
- "This was easy to predict. But the people who pushed AI told us, that it wouldn't…" (ytc_UgzEUDc9_…)
- "It’s wild how this whole discussion nails the deeper problem: AI isn’t replacing…" (ytc_UgxdHrYPT…)
- "I am an artist and I support AI mainly bc I just use it as a refrence for charac…" (ytc_UgwxzyE67…)
- "Wasn't the whole point of the various iRobots (original Asimov story and the Wil…" (rdc_cq6mjkw)
- "We need to stop legitimizing these people by calling them "AI Artists" and start…" (ytc_UgxwzOGmy…)
Comment

> The thing I love about ai is its logic, I’m a very logical person and I like to know about all the possible options and opinions and make a decision based on logic. I defo don’t want it to destroy me though😂

youtube · AI Governance · 2025-06-17T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwAiGC1TXKVxyNvxGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxbcyXm0zYItadye_N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyRI6Y6WkAi_70Q1nh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwHwjplJBxe6H_PSwN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugxox7PzZIeaD6kmIRl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugy1Ha24gD6NZVGjrOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwBWSDQMfUL7Ckyb9p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxBWnjvOeu5pxzdg894AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugx43vFV8_0-3AfjulF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugwe9SEfMvZAxxSVDxx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
```
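The raw response is a JSON array with one object per coded comment. A minimal sketch of how such a batch could be parsed and validated before storage; the allowed values per dimension are inferred only from the outputs shown on this page (the real codebook may include more categories), and the function name is illustrative:

```python
import json

# Allowed values per coding dimension, inferred from the outputs above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping rows
    that are missing an id or contain an out-of-codebook value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Validating against a fixed value set at ingest time catches the occasional response where the model invents a label outside the codebook, rather than letting it silently enter the coded dataset.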