Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Imagine having a boss that never needs breaks or wants a raise. Sounds like a dr…
ytc_Ugw7kJBt3…
The funniest part about the “Ai is Theft Bros” is that pretty much every…single……
ytc_Ugx5mf0li…
Never going to use Ai but if I did, it will never know I have 36 kids in my base…
ytc_UgzCgcJdB…
Holy shit, I came into this expecting to make a "Same, bro" joke in the comments…
ytc_Ugzbzwk2X…
I've not yet read the fundamental question: humans are regulated by Law. AI is n…
ytc_UgwKhBEWk…
@Harrier_DuBois oh god. I just had 300 words on this and was about to hit reply…
ytr_UgwMkgH1D…
Don't worry, the AI bubble bursting will soon be the least of our problems, beca…
ytc_UgwolxR9E…
Hi. I'm a student and looking forward to be a computer scientist/programmer/soft…
ytr_UgxEevnHa…
Comment
There are a few problems with that. The most obvious is that pain is a survival mechanism: there are people with a genetic condition that makes them insensitive to pain, and it is bad for them because it makes them more prone to injuries, sickness and disease. For a robot, that would mean not only that it couldn't stop when risks to its integrity arise, but also that it could pose risks to other robots and to people.
But there is also the problem that pain might not need to be programmed at all; it could emerge from complexity. We (like all other animals) experience pain because of an evolutionary process eons ago, and we should expect that at some point an AI system could develop a similar mechanism, even if it isn't a truly sentient system.
youtube
AI Moral Status
2023-11-22T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgzwTRaYUjVnwDc3lsN4AaABAg.9up5lwPGrjs9vVH-zBG6Ud","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzEFg39EsPysHNc5St4AaABAg.9rAMnTyDjNgA3uKRdIO6eJ","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxQccf-8B4TUrqGyMl4AaABAg.9pWVj0tUQ659q14WXqm5ph","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw-oRlWGSYn-QvXvJd4AaABAg.9pR78enMEBd9pftDObyFG9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyGhxmohp6NLWlX-y14AaABAg.9p2EKfuLRIi9psFg5fP4NB","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwzrK8GJXk6qhNKWaF4AaABAg.9l0fzL_YZ989xPLp1Fc2bg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzuR-nEuxysUKMP7Z54AaABAg.9kcCNArsIT-9keuNEBXZIp","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytr_UgyFIflQZQnAm-bPG2R4AaABAg.9jyqnTNMsU_9o7cFF4aHkU","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytr_UgwwSRK1vI-yuESCN7V4AaABAg.9jlOW_XL-Z49kN3L6nup0c","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzzUkbf2TBm2KpnI9R4AaABAg.9jlDr304qci9o7b8IS77hO","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
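The "look up by comment ID" view above can be reproduced from a raw batch response like this one: parse the JSON array and index the rows by their `id` field. A minimal sketch in Python, where `index_by_id` is a hypothetical helper name and the sample payload reuses the first two rows of the response shown above:

```python
import json

# Raw batch coding response: a JSON array of per-comment rows,
# copied (first two rows) from the response shown above.
raw_response = '''
[
  {"id": "ytr_UgzwTRaYUjVnwDc3lsN4AaABAg.9up5lwPGrjs9vVH-zBG6Ud",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzEFg39EsPysHNc5St4AaABAg.9rAMnTyDjNgA3uKRdIO6eJ",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and index its rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
row = codings["ytr_UgzEFg39EsPysHNc5St4AaABAg.9rAMnTyDjNgA3uKRdIO6eJ"]
print(row["policy"])   # prints "regulate"
print(row["emotion"])  # prints "fear"
```

Keying on the full comment ID (rather than the shortened `ytr_Ugz…` previews shown in the sample list) avoids collisions, since the truncated prefixes are not guaranteed to be unique.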