Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
But what would be the motivation for a robot to do such a thing? Wouldn’t we have to set it up to have that as an objective in the first place? It’s not like robots have inherent drives and motivations to do things
youtube AI Harm Incident 2023-09-12T23:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwQTGTpspvo7a3ESaB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwziopYYRcP_BFXW6Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwhsSY44dn948nHEK14AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwi_uF1Q985fVSbKll4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeCi6RuyZMYfpe0QZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwsktk_GHxwphmstRJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxRmck242GiFOw5GzR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxT3RPayH0wsG2SS294AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyQ9MsSRl_QjQHJD6B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwdcv6C6yN70d1uOel4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
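A raw response like the one above can be parsed and sanity-checked before its records are attached to comments. The sketch below is a minimal illustration, not part of the tool itself: the function name `parse_codings` and the required-key check are assumptions, and the two inline records are abridged copies of the batch above.

```python
import json

# Two records abridged from the raw LLM response above, for illustration.
raw = '''[
  {"id": "ytc_UgwQTGTpspvo7a3ESaB4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwziopYYRcP_BFXW6Z4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

# Every coded record must carry the comment id plus the four dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_response: str) -> list[dict]:
    """Parse a raw LLM response; keep only records with all required keys."""
    records = json.loads(raw_response)
    return [r for r in records if REQUIRED_KEYS <= r.keys()]

codings = parse_codings(raw)
print(len(codings))  # 2
```

Records that fail `json.loads` or lack a dimension would surface here as a parse error or a dropped record, which is exactly the kind of problem this raw-response view exists to diagnose.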