Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
to those idiots who are thinking of programming emotions into a toaster: "DONT!" Emotions evolved to help humans cooperate, care for themselves and force them to do stuff they are not smart enough to figure out themselves. Let's face it, if we felt no pain there would be people who dismember themselves. Since the robots would not need emotions to do self-preserve, cooperate and avoid danger we can safely tell the AI to not to program any emotion into another AI.
youtube AI Moral Status 2017-02-23T14:1…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UggnesQFnwQcjXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugh3O8gYOmh4IngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugg_qn7yjcSiXngCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugh11W__arA2kngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgiQjjWj_b99DHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ughveb6eGDShtHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugjpwhkj423haHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgilPTg4mFW0BHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
 {"id":"ytc_Ugjvd0kteprn43gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgiUhbriGLkzRHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"}]
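Inspecting a raw response like the one above amounts to parsing the JSON array and looking up the entry for the comment of interest by its id. A minimal sketch, assuming the model returns a well-formed JSON array of per-comment codes (the two entries below are abbreviated from the batch above):

```python
import json

# Raw LLM response: a JSON array, one object of codes per comment
raw_response = '''[
  {"id": "ytc_UggnesQFnwQcjXgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugg_qn7yjcSiXngCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

codes = json.loads(raw_response)

# Index the batch by comment id so a single comment's coding is a dict lookup
codes_by_id = {entry["id"]: entry for entry in codes}

entry = codes_by_id["ytc_Ugg_qn7yjcSiXngCoAEC"]
print(entry["emotion"])  # fear
```

Parsing with `json.loads` rather than string matching also surfaces malformed responses (for example, a stray closing parenthesis instead of `]`) as an immediate `json.JSONDecodeError`, which is worth catching when coding batches at scale.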