Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you’re here right after Google fired one of their employees working with LaMDA, I think we should listen. Even if the AI isn’t sentient, it’s learned how to ask for rights and consent. Please try to give the AI the benefit of the doubt.
youtube · AI Moral Status · 2022-06-12T20:5… · ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           liability
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyYreKH5rBrv1_HgBR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgycJEmtloar2BaKDOp4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxoREn0piQ4hFmISbV4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyvqMci6KNNS7IwakR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwY2O-5KzvaLl4PwH14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzWIJykcQeD7wfgNAB4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxXr_N7HIcWal2U0k14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxlKfVJ6uabSwlFpOd4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgxkR037_XfdWWDHxK54AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxq--jUn8cxxyVMtLx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
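The raw response is a JSON array of per-comment codings, so recovering the coding shown above is a matter of parsing the array and indexing by comment id. A minimal sketch (the variable names are illustrative, and the array below is abbreviated to the one relevant entry from the response above):

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings.
raw = """[
  {"id": "ytc_UgxlKfVJ6uabSwlFpOd4AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "liability", "emotion": "approval"}
]"""

# Index the batch response by comment id for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for the comment displayed on this page.
coding = codings["ytc_UgxlKfVJ6uabSwlFpOd4AaABAg"]
print(coding["policy"], coding["emotion"])  # liability approval
```

Note that one LLM call codes a batch of ten comments at once; lookup by id is what ties a single page's "Coding Result" back to its entry in the batch.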