Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- ytc_UgyjJiynp…: Looks like we've been the "futuristic AI" this entire time. Gonna add that to my…
- ytc_Ugxqiq20C…: It’s wild hearing them list off the predictions—2026, 2030, 2035. The timeline k…
- ytr_UgyQRg0Lh…: Imagine robot bodyguards with these perfect skills. People will be afraid to bre…
- ytc_UgxFWlux5…: "in a true emergency it’s a human, thousands of miles away, who is expected to …
- ytr_UgzooVc6G…: He had a "racist email from 1990s" controversy happening December 2022, so he is…
- ytc_Ugz2Bv6PL…: The "AI artstyle" I see that's sort of in all AI art I personally just don't lik…
- ytc_UgxaEoSzS…: It is just a robot .Id you want to have a date with iron ..... Just stupid . Th…
- rdc_fjzk9jq: This is a cult, even Korean Christians are against it. Let's not start calling t…
Comment
Oh look AI is doing a lot of shitty stuff, oh no what can we do......well we can do this thing called turning off the computer/main frame. Click Done, problem solved. Then there is this thing called Delete.....but hey. If the AI goes into a robot, then we perhaps have a slight problem, so we should add off switches to all robots that are easily accessible to anyone with a finger or a long stick. There Terminator stopped.
youtube · AI Governance · 2025-12-07T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzO_OxP0uRdWvJWJqp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7CKNTHG9DSQ4hUMJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyEFJshnm9bbB1PYcR4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz2g-LwjwZs40XplTN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNdKA1BjjtfPOuo_x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzX4glQxdUoxy43JBt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwaoS_eIRF853xxITV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz_3Arb4u3yo9JY5et4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_Im8ZKgt2WH4UFqh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhYo-hutUD3kMXa6V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
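The raw response above is a plain JSON array, so the coded records can be consumed programmatically. A minimal sketch, assuming the response parses as valid JSON (the two records below are copied verbatim from the array above; the variable names are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codes along the four
# dimensions shown in the Coding Result table (responsibility, reasoning,
# policy, emotion). Two records are excerpted here for brevity.
raw_response = """[
  {"id": "ytc_UgzO_OxP0uRdWvJWJqp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz2g-LwjwZs40XplTN4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# Index the coded records by comment ID, enabling the
# "look up by comment ID" workflow described above.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

code = records["ytc_Ugz2g-LwjwZs40XplTN4AaABAg"]
print(code["responsibility"], code["emotion"])  # user indifference
```

The second record is the one the Coding Result table renders (responsibility "user", emotion "indifference"); the dictionary built here simply makes any record retrievable by its comment ID.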