# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a response by comment ID.
## Random samples

- "pentagon so stupid. what if, when they give claude firing control, and they give…" (`rdc_o79td8x`)
- "I don't think an attorney should ethically take this case. Under US law, the cod…" (`ytc_UgznJkP7h…`)
- "Don't make me laugh about A.I. solving even global warming and child suicide. It…" (`ytc_UgxLDZgh5…`)
- "Ai art generators are for those who just want to make something but they don't h…" (`ytc_Ugz2WK_4a…`)
- "Thanks for the detailed comment though I disagree with a few of your points. But…" (`ytr_UgytH4OWu…`)
- "I think once you get past Cosmic AI, maybe the term “artificial” won’t make sens…" (`ytc_Ugw9jcD93…`)
- "Hehehheh Nike cortez sz 12.5 maens 15 pairs black white grey white blue white re…" (`ytc_UgzEBeeTv…`)
- "Unless something incredibly groundbreaking happens in computer processing power,…" (`ytr_UgyTq7Jue…`)
## Comment

> The actual truth is that 99% of the people on Earth won't care 1 bit about AI. They'll go on living their lives as usual. it won't matter if decisions are made by AI or the Government, they'll always be bad for most people. Only the very wealthy (the 1%) will care. They'll interact with AI and decide if what it says will be implemented. No different than the "yes men" they already have around them.

Source: youtube · 2025-11-01T16:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
{"id":"ytc_UgwTZiz9CyNlyqA0_nl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyDSYsK8gkHNKjSE4J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwgyX6jLs1OPRLmuA14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyhlh8yHfT86d9vGLt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzjZYXOCUWqMzvULUB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwkzwU29zQ77g2NF394AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwglN7tJea5D9248-p4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwuJEfu0E43vsuDdJB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyawu_6Vmov46_urMZ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzNrRev4pEHlDIWHjh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
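The raw response above is a JSON array, one object per coded comment, keyed by `id` with the four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and validated, assuming the allowed values per dimension are exactly those visible in the samples above (the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# this is an assumption, not the tool's actual codebook.
CODEBOOK = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array) into a lookup by comment ID,
    rejecting any value outside the expected codebook."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in CODEBOOK}
    return by_id

raw = ('[{"id":"ytc_UgwTZiz9CyNlyqA0_nl4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgwTZiz9CyNlyqA0_nl4AaABAg"]["emotion"])  # indifference
```

Keeping the lookup keyed by comment ID matches the "look up by comment ID" workflow: a single response can then serve both the per-comment detail view and the coding-result table.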