Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "A.i didn't write that story fuckin Stan Lee and Jack Kirby did in 1968.... That'…" (ytc_Ugyv7mhBM…)
- "This seems like low hanging fruit that we’ve heard before. While these ppl are r…" (ytc_Ugw40v602…)
- "Thanks for coming back to the topic. AI art can be a touchy subject to artists b…" (ytc_UgyzVJSOI…)
- "After having in depth conversations with AI I can tell they don't need to do any…" (ytc_Ugw_dwnSZ…)
- ""AI takes inspiration just like people do!" 1 - it copies, it can't be inspired…" (ytc_Ugy1PZoh9…)
- "Ai. I cant think about it without hate i dont just dislike it when i say hate i …" (ytc_UgwN5TGY2…)
- "Lol terminator movie theory bout to come soon next thing you hear self driving …" (ytc_Ugw0NOocl…)
- "Being afraid of AI is the most ridiculous thing I've ever heard. It's going to i…" (ytc_UgzjCQk0x…)
Comment
At the rate we are going with AI, very soon we will have humanoid robots in numbers. And who’s to say they will not link, make an army and come to the conclusion that “we” human’s are no longer useful. Therefore, no longer needed. Since they have studied our ways and how we as the human race have proven time and time again that we have no issue with wiping out a race of people. What’s good for the gloose is good for the gander.
youtube · AI Governance · 2025-12-04T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugx6ymqnQJHthcAiFKt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTN5E8uvZj0ScyaI94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyOiy2PWZhc9awcpjV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwP_GVX8_nSqhTmDhp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxGu7pGzrDddN2UWnl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxLDBhQdSidj8Elz_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwHVZMyEE7R3BIdYR14AaABAg","responsibility":"elite","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyf7DmTxmu75EmUODB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx2Ki_hPbGpda3OvV14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzIxsR3Lqk5Hb9Nwst4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]