Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There trying to make us ai and ATTEMPTED Murder in 2024 and made its first kill LONG ago like in the year 2000 with suicide and in 2026 lunched it’s first Boms (there trying to turn us into ai ) ai isn’t past the human imagination YET so it’s possible to survive unless it is Maby it’s aready past the human imagination I think it’s not but I think that going to be worst then ending the word after all once’ (or if ) it’s past the human imagination
youtube 2026-04-15T18:1… ♥ 1
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           ban
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxGuCotroKIDCRN5Dp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxzuaf_Wx5VhprU9Od4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy5nCvSA1v8USzN2P54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxdyRREFV3u26QiUzl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwQQXlmiEMTTSUQTa14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyKBsEv56yBfNvJPXR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgylYZCAa8BHimpm-u14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyuZXfJ2-Ptv9AB_UR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugws4iAEc0B4lOYKgl14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxegSCcQm_5g8v4rId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
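When inspecting raw responses like the batch above, it helps to parse the JSON and check that every record uses only known codes, since the model can occasionally emit an off-schema value. The sketch below is a minimal validator; the allowed-value sets are inferred from the codes visible in this sample, not from the actual codebook, so they are an assumption.

```python
import json

# Allowed codes per dimension, inferred from this sample batch only
# (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record with an unknown code."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical one-record batch for illustration:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
records = validate_batch(raw)
print(len(records))  # 1
```

A record with, say, `"policy": "tax"` would raise a `ValueError` naming the offending comment id, which makes it easy to trace bad codes back to the exact model output shown on this page.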