Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.
Random samples — click to inspect
- "Guys, the AI doesn't NEED anything. It feels no pleasure or pain, hunger, cold a…" (ytc_UgzpzfzYQ…)
- "I've said this before and I'll say it again AI IS A TOOL not a toy. I used ChatG…" (ytc_UgwGh7pN4…)
- "Some regulation would be nice but i feel like tech companies dont care about tha…" (ytc_Ugyc98uyu…)
- "I really agree totally with Elon, we need to get off this planet because these r…" (ytc_Ugzs8Hatv…)
- "If AI can eventually alter not only itself but also the universe we live in - wh…" (ytc_Ugxr2yCpk…)
- "It is going to be very interesting and dangerous if we reach a point where AI re…" (ytc_UgwuHdKE3…)
- "@susanq6398but then what is the purpose for ai to produce anything if no one can…" (ytr_UgxtdDIwm…)
- "I can’t believe you had the Black man’s story in the middle of the list, he is e…" (ytc_UgzcgKn8O…)
Comment
> It's so interesting how we human are obsessed with creating things that could easily cause our extinction. At some point robots will decide if our behavior is acceptable or not (programmed through empathy) and just decide if killing us or some other type of punishment. Just remember that humans are full of imperfections, why would a greater mind like a robot will obey us. Just give it 100 years, AI will be much more intelligent and powerful than us.

youtube · AI Moral Status · 2021-10-09T21:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw6VoP3vsK29_glx5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvzvTgKmEyG6b6In94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw5xqGIF1HZ83PyxWd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwVuytIPsYmpMGZX1h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzRe3a-zQnwcwDkfmN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxilKuZT_4YYpmXklB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx6zhjlxve7QUC61Vt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxu7mTB2PoT9Jmy99J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwO6SJtPpLpkktJX9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzf_t0x57uxlfY0yNp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
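The comment-ID lookup this page performs can be sketched as follows. This is a minimal illustration, not the viewer's actual implementation: the function name `index_by_comment_id` is hypothetical, and the sample record is one entry copied from the raw response above.

```python
import json

# One record excerpted from the raw LLM response shown above
# (the full response is a JSON array of such coding records).
raw_response = """
[
  {"id": "ytc_UgzRe3a-zQnwcwDkfmN4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and build a comment-ID -> coding-record map."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

# Look up the coding for a single comment by its ID.
codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgzRe3a-zQnwcwDkfmN4AaABAg"]
print(coding["emotion"])  # -> fear
```

The printed values match the Coding Result table above: each dimension (responsibility, reasoning, policy, emotion) is read straight from the parsed record for that comment ID.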