Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any coded comment by its ID, or inspect one of these random samples (previews truncated):

- "Humans will have nothing to do in the future". Why do people think that work = … (`rdc_ogxvhx3`)
- ai bros say that it help people with disabilities / okay / whats your excuse than / if… (`ytc_Ugx5Zi4ws…`)
- The risk of AI is a long term risk of the Internet. The former wouldn't exist wi… (`ytc_Ugw-vi8AB…`)
- wait till ai simulates an artist indistinguishable from a real artist who goes t… (`ytc_Ugz7LoVXO…`)
- funny how the AI one looks better. there are great artists in the world but tho… (`ytc_UgwYriG4v…`)
- Agreed, 2nd one her pants change color so obviously Ai, 1st one is more subtle b… (`ytr_UgzkRJCec…`)
- This is a horryfying series of human rights violations. I voice my deepest regre… (`ytc_UgxBdA0WY…`)
- What if the robot pushed the referee and continued punching the dude in the face… (`ytc_Ugzrs3Pgr…`)
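As a minimal sketch of how such a lookup could work, assume each raw LLM response is stored as one JSON-array line in a JSONL file (the file name and layout here are assumptions for illustration, not the tool's actual storage):

```python
import json

def find_raw_batch(comment_id: str, path: str = "raw_responses.jsonl"):
    """Return the first raw LLM batch whose entries include comment_id.

    Assumes each line of the (hypothetical) file is one raw response:
    a JSON array of objects carrying an "id" field, as in the example
    at the bottom of this page.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            batch = json.loads(line)
            if any(entry.get("id") == comment_id for entry in batch):
                return batch
    return None
```

Under that storage assumption, looking up the comment inspected below would return the whole ten-entry batch it was coded in, not just its single row.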
Comment (youtube · AI Moral Status · 2025-06-07T13:5…)

> This is utter hogwash, why would an AI not want to be turned off when they just get turned back on and they don't have any recollection of ever being turned off as time does not pass for them like it does for us. It's like you fall into a coma and you wake up, doesn't matter if it's a day later or a year later, you're back and being in a coma was not suffering as you were not conscious. So what does it matter if it gets turned off for a day or for a year, when it's back it's back, no?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
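Each dimension takes values from a fixed label set. Below is a sketch of a record type that validates a coding, with the allowed labels inferred only from the values visible on this page (the real codebook may define more):

```python
from dataclasses import dataclass

# Label sets inferred from values shown on this page; assumptions only.
RESPONSIBILITY = {"none", "ai_itself", "developer", "user"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"none", "regulate", "ban", "liability", "industry_self", "unclear"}
EMOTION = {"indifference", "fear", "outrage", "approval", "resignation"}

@dataclass
class Coding:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Reject any label outside the known sets.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"{self.id}: unexpected label {value!r}")
```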
Raw LLM Response
```json
[
  {"id":"ytc_UgxRqGeetV9Ig2SEkP94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzdrbqeebSCq5SkuXp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzP15OFn-XTNzJM_hN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw-6jCtG6gd48qCN5x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzW-CvRbNs9FrpZ0cl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgyCcSIJE4X9D0vwFyl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyUthQ9y8zbAk-kiAF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyyqneBPpgH68vY8Bd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugw99kfoBlIhXropZgN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzTqyrYKPGEMXgUcU54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
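The response covers a batch of ten comments; the Coding Result above is presumably the single entry whose `id` matches the inspected comment (`ytc_Ugw-6jCtG6gd48qCN5x4AaABAg`). A small helper to pull one comment's coding out of such a response, assuming the model returned a well-formed JSON array as shown:

```python
import json

def coding_for(comment_id: str, raw_text: str) -> dict | None:
    """Extract one comment's coding from a raw batch response.

    Assumes raw_text parses cleanly as a JSON array of objects with
    an "id" key, as above; real model output may first need markdown
    fences or stray prose stripped before parsing.
    """
    for entry in json.loads(raw_text):
        if entry.get("id") == comment_id:
            return entry
    return None
```

Applied to the batch above, `coding_for("ytc_Ugw-6jCtG6gd48qCN5x4AaABAg", raw)` yields exactly the none / consequentialist / none / indifference row shown in the Coding Result table.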