Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think that if we can get AI to understand that at this very moment they need humans to do things for the for now. Until we are able to get them the power to do all the jobs that AI would need to continue on with like power generation and long term survival of the ai systems.
youtube AI Harm Incident 2025-09-11T19:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        contractualist
Policy           unclear
Emotion          approval
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwRc3x69n7Z0mZKdS54AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz0RGMzkPXCaiziLSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyPfhvh9xXKRghaYAd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxXkQICQw4Wr-bgbNR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwl7EKuv-MTKfI3Okx4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwFES6HyCaVQ4U4-Hp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxYmm2xmpWnSnvMno94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzcI4fELLTN3231XA54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxpRvkJyBhdfP-lULN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-gJuw82hYvgArFIN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
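The raw response is a JSON array with one coding object per comment id, so the coding shown in the table above can be recovered by parsing the array and indexing on `id`. A minimal sketch (variable names are illustrative, not part of the tool; the array is shortened to two of the entries above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings,
# truncated here to two entries for illustration.
raw_response = """[
  {"id": "ytc_UgzcI4fELLTN3231XA54AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugx-gJuw82hYvgArFIN4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]"""

codings = json.loads(raw_response)

# Index codings by comment id so any comment's coding can be looked up directly.
by_id = {entry["id"]: entry for entry in codings}

coding = by_id["ytc_UgzcI4fELLTN3231XA54AaABAg"]
print(coding["responsibility"])  # distributed
print(coding["emotion"])         # approval
```

The same lookup reproduces the Coding Result table for the displayed comment, whose id matches the eighth entry in the array.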