Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
One way I like to think about it is like this. If you cut off a piece of a robot that it needs to function, does it become irritated? Does it actively search for a way to fix it without being programmed to? If an AI is able to understand something besides orders, such as anger or sadness or happiness, it will then be able to be considered sentient. That’s my opinion, at least.
youtube AI Moral Status 2018-05-16T17:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwSLKWZ-gDTvz4iFN54AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UgyCcGZ11yZ75R6Qib94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzJMvwgYxqz5nK404B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyMBIwcqrgN9UFC6ad4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzyLUKfSx0vz_ncIrV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwUa3FLJc02eDN0_Zl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyrDxn8zSt9FQ9vkQd4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgxuPwG4webi5c3ke0t4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxOcpCNNFzntJebGO94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxtcrv1MIQF2uHMgtB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]
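A minimal sketch of how a raw response like the one above can be parsed into per-comment coding results. The JSON snippet inside the sketch copies two entries from the response verbatim; the expected dimension names (responsibility, reasoning, policy, emotion) are taken from the coding table on this page, while the function name `parse_codings` and the validation logic are illustrative assumptions, not part of the actual pipeline.

```python
import json

# Two entries copied verbatim from the raw LLM response shown above.
raw_response = """
[ {"id":"ytc_UgwSLKWZ-gDTvz4iFN54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgyCcGZ11yZ75R6Qib94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"} ]
"""

# Keys every coded record is expected to carry (per the coding table).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response and index the codings by comment id.

    Raises ValueError if a record is missing any expected dimension,
    so malformed model output fails loudly instead of silently.
    """
    records = json.loads(text)
    codings = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing keys: {missing}")
        codings[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS - {"id"}}
    return codings

codings = parse_codings(raw_response)
print(codings["ytc_UgwSLKWZ-gDTvz4iFN54AaABAg"]["emotion"])  # disapproval
```

Indexing by comment id is what lets a page like this one pull the single coding result for the displayed comment out of a batch response covering many comments at once.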