Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In the future, if a robot wants rights then it should have rights.  But I think if you build a servant robot that wants rights (meaning it wants to do things other than what it was made for) then you're doing it wrong.
youtube AI Moral Status 2017-02-23T22:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiltTSEWD_SEXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggEVfo-0BT3v3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggxUeCR4fvePngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj-C0VSwgP-VXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugiu3igcszow23gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg9D6n1e0Y6IngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiLfvLZG9z0PHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugg8zOaOKpgfSXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggejCERUBBXa3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
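A raw batch response like the one above can be parsed into per-comment codings before display. Below is a minimal sketch in Python, assuming the response is a JSON array whose entries carry exactly the keys shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `parse_batch` helper and the validation rule are illustrative, not the tool's actual implementation.

```python
import json

# Excerpt of a raw batch response in the format shown above.
raw = '''[
  {"id": "ytc_UgjUKMnhflFwrHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggEVfo-0BT3v3gCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]'''

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(text):
    """Index codings by comment id, rejecting malformed entries."""
    coded = {}
    for entry in json.loads(text):
        if set(entry) != EXPECTED_KEYS:
            raise ValueError(f"unexpected keys in entry: {sorted(entry)}")
        coded[entry["id"]] = {k: v for k, v in entry.items() if k != "id"}
    return coded

coded = parse_batch(raw)
print(coded["ytc_UggEVfo-0BT3v3gCoAEC"]["reasoning"])  # deontological
```

Indexing by `id` makes it cheap to look up the coding for any single comment, which is what the per-comment "Coding Result" view above displays.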