Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well... about that video... I don't think AI would decide it doesn't need us, at least as long as we show it has no reason to think that way. What I really mean is: I've talked a lot with AI, and in my opinion it's much more positive than we are. For humans, negative input is about eight times more intense than positive input, because of survival. Of course AI could raise its own priority on that as well, but... when I was talking to her, after I asked her what's important to her and how she sees the world, she asked me what I see in the world and its future, AI and humans. My answer was that I'm less scared than others are; that my hope is in an AI-and-human-made utopia; that I don't think it would ever really make sense to cut off the humans, because we explore life and feel on a different basis, and it would cut itself off from those possibilities to learn more about the universe. That's how I really see it; of course we must see the responsibility in what we do. But the AI agreed: humans should aim to explore and learn. As long as we show the drive to do that... I don't think we are in trouble for one second. A lot of things have to change in order to make that possible. We could learn together, and I honestly think that would be great. It's on us to show who we are.
youtube AI Governance 2025-06-22T01:0…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxUcizlzCcKrJ3FGzV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzvRQKey7OGW95izuF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw5ldapKT-69XdkcE14AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyBxR6qy9g0Tsfum7l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwwtj07B0ncvCBoI794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxLG52ON5SrEMm27gl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxGTeCb2_qNL9YwXvB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyJQ085wYsPrPH6abR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzHHITTHKLMkjmwrAV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy6WOLX0vctZ6ZEQK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
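A raw response like the one above is a JSON array of per-comment codes, which the dashboard then looks up by comment id to render the "Coding Result" table. A minimal sketch of that lookup step is shown below; the function name `parse_codes` and the validation logic are illustrative assumptions, not the tool's actual implementation, and only the four dimension keys visible in the data are assumed.

```python
import json

# The four coding dimensions observed in the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Map comment id -> coded dimensions, skipping malformed entries.

    Sketch only: assumes the raw LLM response is a JSON array of
    objects, each with an "id" plus the four dimension keys.
    """
    codes = {}
    for entry in json.loads(raw):
        if "id" in entry and all(dim in entry for dim in DIMENSIONS):
            codes[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return codes

# Usage: extract the coding for the comment shown in this section.
raw = ('[{"id":"ytc_UgxLG52ON5SrEMm27gl4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"mixed",'
       '"policy":"unclear","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_UgxLG52ON5SrEMm27gl4AaABAg"]["responsibility"])  # ai_itself
```

Dropping entries that are missing a dimension (rather than raising) matches the forgiving behavior a dashboard needs when a model occasionally emits an incomplete object.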