Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
With no sense of morality, noone within pur society can coexist. We have to setup parameters for the ai, its like dealing with psychos they don't understand why they shouldn't use something thats immoral if ot gives them advantage.
YouTube AI Harm Incident 2025-07-26T09:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxbQi9de76edWf2MVJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxzM1X2g3GxMrlirrZ4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzwEWaZxlU3POji3PR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgyjkYLnpTF4CozCpGJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwRIcBxSsM_zSxyZt54AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw3WZ_nOHfIjYDTjcl4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwoH35RdD3yzEFllNh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw9PGDCFbGyl9eFJDd4AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyiIz7cSjo8I3T9mQh4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugy-p6TOs39rhzFD-KF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"}
]
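A raw response like the one above can be parsed and validated before it reaches the coding table. The following is a minimal sketch, assuming the allowed category values are exactly those observed in the displayed data (no official schema is given here); `parse_raw_response` and `ALLOWED` are hypothetical names introduced for illustration.

```python
import json

# Assumed category vocabulary, inferred from the displayed records only.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM JSON array and keep only records whose
    dimension values fall inside the assumed vocabulary."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_UgxbQi9de76edWf2MVJ4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"regulate","emotion":"fear"}]')
print(parse_raw_response(raw)[0]["emotion"])  # fear
```

Dropping out-of-vocabulary records rather than repairing them keeps the downstream counts conservative; a rejected record can be re-queued for a second coding pass instead.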