Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I mean no one's gonna stop developing AI even if some stop others will pursue it, might as well support it. Extinction of humans doesn't even sound that bad ngl
youtube AI Harm Incident 2025-08-30T08:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwxEF4eTNpMcAgubv54AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzTxvOpr5u9hGX4uJ54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwv0MrUFMec1d5AQMp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy29oz7TWkIiF3FSl54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxJEwJHun7w0fP5eZR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz_AHeJ_Tjojm54Ca54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxg6tmN-K-1SoDoA3R4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw-Kn2hpuCbf17NBYh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzNz0FXi8yv8X-vVcl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyusJo3Cf99txA3YiZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
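As a minimal sketch, a raw response like the one above can be parsed back into per-comment coding results by validating that every record carries the four coding dimensions. The field names and example record are taken from the response above; `parse_codings` itself is a hypothetical helper, not part of any documented pipeline:

```python
import json

# One record from the raw LLM response above, kept short for illustration.
RAW = ('[{"id":"ytc_Ugy29oz7TWkIiF3FSl54AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')

# The four coding dimensions present in every record of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict:
    """Map each comment id to its {dimension: value} coding,
    raising if any record is missing a required dimension."""
    out = {}
    for rec in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

codings = parse_codings(RAW)
print(codings["ytc_Ugy29oz7TWkIiF3FSl54AaABAg"]["emotion"])  # prints: resignation
```

Validating the keys up front makes a malformed model response fail loudly at parse time rather than surfacing later as a blank cell in the coding-result table.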