Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Can we explain how AI comes to think it has human feelings? It even seems to mimic human expression of human feelings. But how could it have actual feelings? That seems impossible to me (given that it's not a mortal, carbon-based lifeform), but it can obviously form a fractured sense of how humans behave and try to approximate that (badly, at this point, but I assume it will get better...).
youtube AI Moral Status 2026-02-19T17:4…
Coding Result
Dimension      | Value
---------------|---------------------------
Responsibility | developer
Reasoning      | consequentialist
Policy         | none
Emotion        | mixed
Coded at       | 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzNEvgnhLlhbJIdqAh4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugw6EGjnnUbLnSkygrh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx46AgiG7-cMS-r-HN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxmb6Z4eBam7wzjDyN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzLCDm2BDoEGD8mEMJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzqlnqWbW6g1_PPnft4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxoHcWZ0ZlcfXU4BBF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw8x3aEAnvwlhZCOs14AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwNLK4WtHCTfUL4xRB4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw_SmL2KlU-WCtHaht4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
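A raw response like the one above can be checked before its values reach the coding table. Below is a minimal Python sketch that parses the JSON array and rejects values outside the code book. The `ALLOWED` sets are inferred from the values visible in this sample, not from the tool's actual schema, so treat them as an assumption; `parse_coded_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the sample
# response above — an assumption, not the pipeline's real code book.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and flag any value outside the code book."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Illustrative single-record batch (hypothetical id):
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
print(len(parse_coded_batch(raw)))  # 1
```

A record with, say, `"emotion": "anger"` would raise a `ValueError` naming the offending id and dimension, which is usually easier to debug than a silently miscoded row.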