Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I believe that even asking the question in the title is evil, and probably only the weapons and war industry wants us to discuss it. A machine should never be allowed to decide about the life of a human; a human should not do this either, but at least a human can be held responsible, and it is beneath human dignity to be killed by an autonomous machine. I will not watch this shit. A little less evil, but still very bad, is what is happening with self-driving cars; in Germany the car industry is making efforts in that field, they are just doing it quietly. It can't be real, it's so insane, if I imagine a self-driving car endangering me on the street. What kind of feelings does that image give you? Just because they want to! You could do something like this if nothing in traffic drove faster than 8 km/h or so... but yes, I also find it insane to drive normal cars at high speed... And always this dumb talk about AI. There is no AI, and there never will be, because in the end it is always programmed by humans. I am getting used to the term a little, but intelligence is something only living beings can have; an AI is something a human makes, something very complex, but in the end no different from me painting a dot on a sheet of paper.
youtube AI Responsibility 2016-08-06T19:1…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           ban
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgixlPFeQ1R8H3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiyK8Zp7jtHZ3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiEG0Zg29l0mHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugj34Qf8UOhxm3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjY_cocRkGtEHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ughzyf_JNlSVO3gCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ughc5f8nD8LA4XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggxCaNMyVyivXgCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgiCjS6CNIM8t3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzEXf-BkR-bOUoMASp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
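A raw response like the one above can be checked programmatically before the labels are accepted into the coding table. The sketch below is a minimal, hypothetical validator, not part of the tool itself: the allowed category sets in `SCHEMA` are an assumption reconstructed only from the values that actually appear in this response, not the full code book.

```python
import json

# Assumed schema: dimensions and the category values observed in this
# coding run. The real code book may allow more values than these.
SCHEMA = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "resignation", "approval"},
}

def parse_raw_response(raw):
    """Parse the raw LLM output and keep only records whose labels
    are known for every coding dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment id cannot be joined back
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example: the first record from the raw response above.
raw = ('[{"id":"ytc_UgixlPFeQ1R8H3gCoAEC","responsibility":"company",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(parse_raw_response(raw)[0]["policy"])  # -> ban
```

Dropping (rather than repairing) out-of-schema records keeps the validator simple; a production pipeline would more likely flag them for manual review.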