Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
looks like the start of automatons. we'll need helldivers to stop them. but seriously that's concerning we're evolving to fast
Source: YouTube · AI Harm Incident · 2025-09-11T08:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxUUNMwCpySkRqGlg54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzqKDkQI_b25kV6WS54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwrHYOPq6oTN3jAQE54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxJ_SSEmJrUGGvLxcx4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgygZ9UjBPYbbYPU2L54AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy0CEink2CgMCv_OCB4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxavQGV-ZhrqXzA42Z4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzLfXm1cHB7YY3ZGy14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw2pd0nsH9Rknboy0B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzXeXcOVhtCapfM4214AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]
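A minimal sketch of how a downstream step might parse and validate such a raw response. It assumes the label sets visible in this export form the full codebook; `ALLOWED` and `parse_codings` are illustrative names, not part of the actual pipeline:

```python
import json
from collections import Counter

# Allowed values per coding dimension, inferred from the labels seen in
# this export -- the project's real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "resignation", "approval", "outrage", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only records whose values
    all fall inside the assumed codebook."""
    return [
        rec for rec in json.loads(raw)
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Two records taken verbatim from the response above.
raw = ('[{"id":"ytc_UgxUUNMwCpySkRqGlg54AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"},'
       '{"id":"ytc_Ugw2pd0nsH9Rknboy0B4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')

codings = parse_codings(raw)
emotion_counts = Counter(rec["emotion"] for rec in codings)  # Counter({'fear': 2})
```

Filtering against a fixed label set catches the common failure mode where the model invents an off-codebook value; dropped records can then be re-queued for re-coding rather than silently skewing the tallies.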