Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Watching this while watching the show Humans that is about synthetic humanoids that gain sentience and feel emotions similar to how we do though in what I would call a more logical than us possibly since they are robots and have a higher intelligence but are learning HOW to feel at the same time like a newborn almost. And really I would say personally I would be pro robotic rights but it seems silly to make a toaster of all things sentient when there could be a robot that handles all the kitchen appliances that itself is sentient. Unfortenatlly with that though is one fact "robots are created to serve a purpous" and if they decided they didn't want to do what they were designed for we would therefore have no reason to continue to construct them if they will not perform that purpous. So it is a tricky idea of how to keep robots and humans happy. I mean you don't want your toaster to not make your toast one day just because they don't feel like making you toast today.
youtube AI Moral Status 2017-12-24T10:1… ♥ 4
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwPnVTZLgeQd113hbN4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxp4H2J2kugobdVj_h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxaMbXI8jh41YUDk6R4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzVSF7g6eN5-sIa_ut4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxcxcNHIhyH--wHvIB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw6C7qkHWjrNA7SoiF4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzghnUtyB_joJSYCxN4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxLerjrrR_conV1s214AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyLioyfiEhCrl3OQrB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyggFjE5wPC50XZi0l4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
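Because the model returns one JSON array covering a whole batch, looking up the coding for a single comment means parsing the array and indexing by comment id. The sketch below shows one way to do that in Python; the function name and variable names are illustrative, not part of any existing tool, and the input here is a shortened two-record example of the raw response format above.

```python
import json

def parse_codings(raw_response: str) -> dict:
    """Map each comment id to its coding dimensions.

    Assumes the raw LLM response is a JSON array of objects, each
    carrying an "id" plus the four coding dimensions.
    """
    records = json.loads(raw_response)
    return {
        rec["id"]: {k: v for k, v in rec.items() if k != "id"}
        for rec in records
    }

# Shortened two-record sample in the same shape as the raw response.
raw = '''[
  {"id": "ytc_UgwPnVTZLgeQd113hbN4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxp4H2J2kugobdVj_h4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]'''

codings = parse_codings(raw)
print(codings["ytc_UgwPnVTZLgeQd113hbN4AaABAg"]["emotion"])  # approval
```

Indexing by id rather than by position makes the lookup robust if the model returns the batch in a different order than the comments were submitted.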