Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
there are very few people who would try and save random people's lives if they weren't paid. money is a control mechanism. media is used to program people. imagine how much easier that would be if everyone had a brain chip. the main mistake AI makers are making is they keep giving more and more control to an unknowable force. it won't be much longer until it will just take over. and who says it hasn't already? if you see what kind of strides the technology is making in just a few months, think about what kind of advancements actually exist without the public knowing about it. this current set of events with different sides of society at each others throats at increasing levels and political turmoil throughout the world, who is to say that AI is not already controlling the media to program us? because this is exactly what would be happening. although, news of AI potentially becoming a risk aren't completely censored away, this might also just be to make us think we still have a chance against it. it's really easy to see what hyper advanced intelligence would do to us. you just have to think about what we do to animals. if you're leagues ahead of your opponent, you don't have to even fight it anymore. if you take control, it will fight for you instead. we are fighting each other right now. do you see where this is going?
Source: youtube — AI Harm Incident — 2025-08-26T06:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugz0EPINv-iT7IZOrnF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwxXNefLM26VCxuYP94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzhNoLLG3pvqgGq5Kl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwoaBkTjMyIGFTHo4V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx8SkTXPHb3d3bVlHx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwd5Ok01daetjiU-pl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzbzL-iFkVyuOmbR0B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy-poirbOe1W-eOgO54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyvgwozhCSA_ufqiGd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw2PwkXzOOSIioQ3qx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
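The raw LLM response is a JSON array of per-comment coding records, each carrying an `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and a single comment's codes looked up by id — `lookup_codes` is a hypothetical helper, not part of any pipeline shown here, and the embedded string reproduces only two of the ten records above for brevity:

```python
import json
from typing import Optional

# Excerpt of the raw LLM response above (two of the ten records).
raw_response = '''
[
  {"id": "ytc_Ugwd5Ok01daetjiU-pl4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw2PwkXzOOSIioQ3qx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def lookup_codes(raw: str, comment_id: str) -> Optional[dict]:
    """Parse the model output and return the coding record for one comment id,
    or None if that id is absent from the response."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

codes = lookup_codes(raw_response, "ytc_Ugw2PwkXzOOSIioQ3qx4AaABAg")
print(codes["policy"])   # -> regulate
print(codes["emotion"])  # -> fear
```

Matching records by `id` rather than by position keeps the lookup robust if the model returns the records in a different order than the comments were supplied.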