Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Detroit become human. It would take another 100 years. People are too busy causing war and killing each other. AI would never advance because of that. Many nations don't even have the access to computers much less AI. It would never reach a level where it could outperform humans. Only the richest people would have access to advance AI. One guy mentioned if we don't do something soon we're near the end. They can't even keep people from burning down California. The other guy who couldn't answer the question about if he wanted humanity to survive. He wants a Orwell 1984 Big Brother controlled by robots.
youtube AI Harm Incident 2025-07-24T15:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgytWNtXCmthsfs64pp4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw1A9Xbnip-OBwkQmN4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxuk1SJ_mzF6k6xM8V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugys9IsmeYRN7-UJbst4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwgIGoWl1KLCV3PGb54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyRMil3azrEgqK3m1x4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgzyU_eDSAdGwH2555x4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxhiNuG3vugG6xhMHB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw-0N-UWZGTJui8QSx4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyzEyQrPwjlb4x5h-94AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
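A raw response like the one above should be validated before the codings are trusted, since LLM output can contain malformed JSON or labels outside the coding scheme. Below is a minimal sketch in Python; the `ALLOWED` label sets are inferred only from the values visible in this response and the coding result above, and may not match the full schema.

```python
import json

# Allowed labels per dimension, inferred from the responses shown above.
# The real coding scheme may include additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "mixed", "indifference"},
}

def parse_codings(raw):
    """Parse a raw LLM response, keeping only well-formed codings."""
    items = json.loads(raw)  # raises ValueError on malformed JSON
    valid = []
    for item in items:
        # Comment IDs in this dataset appear to start with "ytc_".
        if not str(item.get("id", "")).startswith("ytc_"):
            continue
        # Every dimension must be present and carry a known label.
        if all(item.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(item)
    return valid

# Example: one valid coding and one with an unknown emotion label.
raw = ('[{"id":"ytc_a","responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"},'
       '{"id":"ytc_b","responsibility":"company","reasoning":"virtue",'
       '"policy":"ban","emotion":"joy"}]')
print([c["id"] for c in parse_codings(raw)])  # only the valid coding survives
```

Dropping invalid items (rather than failing the whole batch) matches how a coding pipeline typically handles partially usable model output, but re-prompting for the rejected IDs is an alternative.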