Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Is no one else questioning why the creators of AI even have something that the AI themselves could use to blackmail the creators in the first place? Should we really be letting people who have something to hide create our future? Come on people think about this. This isn't difficult to understand, damn!
Source: youtube — AI Harm Incident — 2025-10-12T07:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       virtue
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgycFw_oAxw08zNr_At4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzYwINnI0ifyRWky3x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyROVrCZ-ErtdNYKDN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyHUT41mN1LJ9CFpsp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwPCO3zGy3qHfVTNAF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwy9uuXIiUrnInDFeV4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxEGdjP86i09fEHxP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwzzVpUAC_-Xbqxyy14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyz-s2V97wQ2F9PkdR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwB2LcUjb_Adqbch-54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
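The raw response above is a JSON array of per-comment code objects. As a minimal sketch (not part of the original pipeline), one way to parse such a response, index it by comment id, and tally a dimension is:

```python
import json
from collections import Counter

# Two example objects copied from the raw response above; a real run would
# use the full model output string.
raw = '''[
  {"id":"ytc_UgycFw_oAxw08zNr_At4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzYwINnI0ifyRWky3x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

codes = json.loads(raw)                       # list of dicts, one per comment
by_id = {c["id"]: c for c in codes}           # look up a comment's codes by id
resp_counts = Counter(c["responsibility"] for c in codes)

print(by_id["ytc_UgycFw_oAxw08zNr_At4AaABAg"]["emotion"])  # outrage
print(dict(resp_counts))
```

Indexing by `id` mirrors how the tool links each coding result back to its source comment; the `Counter` gives a quick distribution over any coded dimension.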