Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> No, I don’t think that robots deserve rights. The most intelligent AI today aren’t conscious, and can’t feel emotions. They are run through code that changes itself and learns to achieve its goal, whether it’s good or bad. This is why in science fiction they are portrayed as usually morally ambiguous things, since they have no inherent principles of right and wrong.

youtube · AI Moral Status · 2022-06-28T21:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyYreKH5rBrv1_HgBR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycJEmtloar2BaKDOp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxoREn0piQ4hFmISbV4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvqMci6KNNS7IwakR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwY2O-5KzvaLl4PwH14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWIJykcQeD7wfgNAB4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxXr_N7HIcWal2U0k14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlKfVJ6uabSwlFpOd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxkR037_XfdWWDHxK54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxq--jUn8cxxyVMtLx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
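A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal, hypothetical validator: the four dimension names come from the JSON shown here, but the allowed value sets are only those *observed* in this one batch, so the real coding scheme almost certainly includes values not listed in `ALLOWED`.

```python
import json

# Value sets inferred from the single batch shown above; the actual
# codebook likely defines more values per dimension (assumption).
ALLOWED = {
    "responsibility": {"none"},
    "reasoning": {"deontological", "consequentialist", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "approval"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records.

    A record is kept when every coded dimension is present and its
    value appears in the (observed) ALLOWED set for that dimension.
    """
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: the first record is kept, the second is dropped because
# "virtue" was never observed in this batch's "reasoning" values.
raw = (
    '[{"id":"ytc_a","responsibility":"none","reasoning":"deontological",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"ytc_b","responsibility":"none","reasoning":"virtue",'
    '"policy":"none","emotion":"indifference"}]'
)
print(len(parse_batch(raw)))
```

Filtering rather than raising keeps the pipeline running on partially malformed batches; an alternative design would log rejected records (with their comment IDs) for manual re-coding.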