Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Stopped watching after three minutes. Total idiot bullshit, claiming AI actually ACT somehow, and doesn't just guess words based on a query. You'd have to be a complete moron not to realize that none of the above examples would have appeared without the user asking them to write them. It's literally like blaming a notebook and pen for the fact that you wrote "Hitler is cool" when you wrote it yourself. A generation of imbeciles.
youtube AI Moral Status 2025-12-13T19:4…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyUC2SawyaSD7pedXV4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy751SDTFf9seHVGQR4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugxp9OgXFk4MTII3YAd4AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "unclear", "emotion": "sadness"},
  {"id": "ytc_UgxZeE_ZgNdSrP2eBtR4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxM5tKdBgQ9Q0UqtAZ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxG9xPXRxohjg99opt4AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgypSpK_BqZ7OpZc__l4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw46TC8VZYiRe_lM314AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",    "emotion": "outrage"},
  {"id": "ytc_UgxtKxw5SUwYDBa3yBx4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyulCFpT8PfcwVlWk14AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"}
]
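A raw response like the one above can be checked against the coding schema before it is stored. Below is a minimal sketch in Python, assuming the allowed value sets inferred from this sample output (they may be incomplete); the `validate` helper and the single-record `raw` string are illustrative, not part of the tool itself.

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# These sets are an assumption and may not cover the full codebook.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"outrage", "indifference", "sadness", "approval", "fear"},
}

# One record taken from the raw LLM response above.
raw = (
    '[{"id":"ytc_Ugw46TC8VZYiRe_lM314AaABAg",'
    '"responsibility":"user","reasoning":"consequentialist",'
    '"policy":"none","emotion":"outrage"}]'
)

def validate(records):
    """Return (id, dimension, value) triples that violate the schema."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

records = json.loads(raw)
print(validate(records))  # → [] when every code is in its allowed set
```

A record with an out-of-schema value (say `"responsibility": "robot"`) would come back as a violation triple instead of an empty list, which makes malformed model output easy to flag before coding results are written.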