Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The audacity that you have right of letting this video exist and pinning that comment. The amount of information you are give being false and bias, when you end up contradicting yourself in your own pinned comment is unfathomable. Even IF this is truly documented like this, it is false. Nothing else is to blame than the humans who programed it themselves. YOU, the fucking human, are the one to code the fucking thing and bring it to life. If you tell it "Do whatever you can to survive" of course it's going to kill, of course it's going to black mail. THAT is why you give it the three major laws of AI. 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. You fail to put this laws into account for the AI, then of fucking course it's going to kill you. 90% of deaths caused by AI is some sort of Human Error. You're blaming inanimate objects for you fucking mistakes. And then you have the audacity to advertise hiring people who are AI expert. You know how fucking stupid this is?
youtube · AI Harm Incident · 2025-08-29T21:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxImz8TXwx4DqpAztR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugyuau_0P-iuRn2M5DN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxjnPJ173Vz93J57Nx4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwk4014Stq0hy74YoF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwJuV_zQEsDfd5npLJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxEAf2KqVVf8YaET7V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxqTvtbX3sn0fKLJDp4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugy_XqlFd9PrRE1VdVJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UgzoDagLl-fB3hQGtYF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyHcEQ8AigeBSQzExl4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
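The Coding Result above is the entry from this batch response whose id matches the comment. A minimal sketch of extracting one comment's codes from such a batch (only two entries from the array above are reproduced in the string for brevity; the function and variable names are illustrative, not from the tool itself):

```python
import json

# Two entries copied from the raw LLM response above; in practice, load the
# full JSON array returned by the model.
raw = """[
  {"id": "ytc_Ugwk4014Stq0hy74YoF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxImz8TXwx4DqpAztR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]"""

def codes_for(raw_json: str, comment_id: str) -> dict:
    """Parse the batch response and return the code object for one comment id."""
    records = json.loads(raw_json)
    by_id = {r["id"]: r for r in records}
    return by_id[comment_id]

row = codes_for(raw, "ytc_Ugwk4014Stq0hy74YoF4AaABAg")
print(row["responsibility"], row["policy"])  # developer liability
```

Indexing the array by id before lookup makes it easy to spot duplicates or missing comments when the model drops an entry from the batch.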