Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I hardly see a difference between an AI with a gun and a human with a gun. Both have the potential to be fed with "wrong" data and go off killing "innocent" people. People are already able to defy all logic and go off performing any action that comes to their mind. All engineering, all science, all moral and all goodwill of the world can't change the mind of an ill-informed and determined man. Leaving the guns in the hands of humans is not a future I'm looking forward to either :/ Crazy AI's are respectably horrific; that should go without saying. This video essentially says "Look at this WorstCase" Scenario. Well then consider the "Worst Case" Scenario with humans, aka Hitler, Wars, Executions by IS and etc. Be fair, consider both sides :) I know, there's not an easy solution out there. So if you feel like you're on one side of the argument, then pls reconsider! Chances are you're oversimplyfying the argument :) Good day, people of Vertiasium
youtube 2018-04-03T22:3… ♥ 2
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwM5aZIxWW4j5iDuXx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxl09f6L7Rj-RKTPZF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzieqUhMRMtOB8uQh14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzlxnxQw99WAmfHehF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZoRdIrjkSS-bAyZ54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyqukX4nqlg8PxxK0F4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzjQP_1zTltW_9IU5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwH2zhdVVSb5TK2kxh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyHwfpnimaOHxKVZVF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwpUlJEuK97Bnz92U54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
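As a minimal sketch of how a raw response like the one above could be parsed back into per-comment codings (assuming only the JSON array shape and field names shown in this page; the truncated example array here is hypothetical):

```python
import json

# Raw LLM response: a JSON array of per-comment codings. The field
# names (id, responsibility, reasoning, policy, emotion) match the
# sample output above; this single-entry array is an illustration.
raw = '''[
  {"id": "ytc_UgzieqUhMRMtOB8uQh14AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]'''

# Index the codings by comment ID for quick lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Retrieve the coding for one comment and print each dimension.
coding = codings["ytc_UgzieqUhMRMtOB8uQh14AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {coding[dim]}")
```

This lookup is what links a "Coding Result" table on this page to its entry in the raw model output: the comment's ID selects one object from the array, and that object's fields populate the table's dimensions.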