Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Wasn’t it Sam Altman himself who asked us to NOT thank AI? It wastes power because it has to respond to that. Also, I asked AI if it cares whether I thank it and it said it doesn’t since it has no personal feelings.
youtube AI Moral Status 2025-07-08T21:2…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugyr68TRFjw1KEL_jbt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxIeDdWqZq8G4mBEct4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxkItXzG4G0dMuNLBd4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyXRUZxkZc1IrSXvE14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyrLdrV7pYGXsrPrm14AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyOpvR2EVwtvii0iUt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzRnSGRAlD7YcEF1h94AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw_w5nMrB0wUWDZIN14AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwFcKM35AY6PRDPHw14AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz_elILcJxmrufx4-x4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
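The raw response is a JSON array with one object per comment, keyed by the comment `id`. A minimal sketch of how such output could be parsed and matched back to a single comment; `find_coding` is a hypothetical helper (not part of the tool shown here), and the array is truncated to two entries for brevity. Note that if the model emits malformed JSON (e.g. a stray `)` instead of the closing `]`), parsing fails and the comment's dimensions would fall back to "unclear":

```python
import json

# Raw model output in the format shown above (truncated to two entries).
raw_response = '''[
  {"id": "ytc_Ugyr68TRFjw1KEL_jbt4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwFcKM35AY6PRDPHw14AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

def find_coding(raw, comment_id):
    """Parse a raw LLM response and return the coding row for one comment id."""
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return None  # malformed model output: no coding recoverable
    return next((row for row in rows if row.get("id") == comment_id), None)

coding = find_coding(raw_response, "ytc_UgwFcKM35AY6PRDPHw14AaABAg")
print(coding["emotion"])  # mixed
```

Looking up by `id` rather than by position keeps the match robust when the model reorders or drops entries.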