Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Scar Jo has to be the queen of the internets depravity. Remember the Asain dude …" (`ytc_UgyPWp0vq…`)
- "Whenever you use AI, it takes 13 gallons of water to cool that system back down.…" (`ytc_UgwESc_GN…`)
- "I look forward to meeting all ai-healthcare. If they are constructed without pre…" (`ytc_UgyHNumAS…`)
- "there's something so beautiful about how some of the earliest paintings humanity…" (`ytc_UgwkaIG1t…`)
- "My thoughts: if human labour gets replaced by AI, besides the jobs getting scare…" (`ytc_UgzhRGPtG…`)
- "That’s exactly why AI is used as a tool and not to verify an identity.…" (`ytc_Ugz-yamYp…`)
- "@irocksobad101 you'll see that when there was a car that was planning to turn le…" (`ytr_UgzeSFYGJ…`)
- "Ok. And who will pay for services? Who will be the consumers? If the idea of cre…" (`ytc_UgwDVGrR_…`)
Comment
You don't need superintelligent AI for it to result in bad decisions. All you need is for it to be given power. No one would hook up nuclear weapons to a magic 8-ball, but there is a good chance someone will think it's a good idea to connect them to an AI process (regardless of whether that process is genuinely intelligent or a random number generator).
Source: youtube · AI Moral Status · 2025-12-12T02:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxIWFxr7qm1Fy93DyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxFwMPF3qiWztSci9R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzd3sOhB-v7fvEEUa14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyauwK6WdsjH8BEDhx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxEe5oK3fE_i8edO8R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwvwG9DuSc5NdYH--V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzWIU7BRKEsQrZ6RRt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxVz0xdMTo5BkQyJJx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwzagCkWVDZSnfAQHV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzMXTL64Bem9_aJVdx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
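A pipeline consuming these raw responses needs to parse the JSON and reject malformed records before storing the coded dimensions. Below is a minimal sketch; the allowed value sets are only those observed in the sample response above (the project's actual codebook may define more labels), and `parse_coding_response` is a hypothetical helper name, not part of the tool shown here.

```python
import json

# Value sets inferred from the sample output above; the real codebook
# may contain additional labels (this is an assumption, not the spec).
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record must be an object with an id and known values
        # for every coded dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical example: one valid record, one with an unknown label.
raw = (
    '[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"bad"}]'
)
coded = parse_coding_response(raw)  # only the first record survives
```

In practice the LLM occasionally emits extra keys, truncated arrays, or invented labels, so validating against a fixed value set (rather than trusting the output shape) is the safer design.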