Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The idea of autonomous weapons is horrific, I know enough about vision systems and automation to know that programming such a system would have to be flawless. And even then how can we be sure that when there is an "accident", and there will be, that it really was accidental. There will always be doubt. Don't get me wrong these things are coming, I just don't like the idea, too many variables.
youtube 2021-12-06T17:5… ♥ 38
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgygOlWtPg9XGnUh1wZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugywe0sEhY6O5Fc3Oh14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzH2cRoKalO1AV0kPF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxTp0rvOO6jaxbEpj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxdYA7jB3Bgmmj-UBJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwkGtFvQPlS5hwCQPl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMTgDROWQqc_HjhIp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgygzkVpL-gI3qe6d054AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwDrP_0u555N0nbbuh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwp5R11gdCAFGAjyRV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
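A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal, hypothetical validator: the four dimension names come from the JSON shown here, but the allowed value sets are only inferred from the responses in this batch and may not match the full codebook.

```python
import json

# Allowed values per dimension, inferred from the batch above
# (assumption: the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "government", "ai_itself", "user",
                       "distributed", "company", "developer"},
    "reasoning": {"mixed", "deontological", "consequentialist",
                  "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "resignation"},
}

def parse_llm_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only rows whose codes
    fall inside the (assumed) coding scheme."""
    rows = json.loads(raw)
    return [row for row in rows
            if all(row.get(dim) in vals for dim, vals in ALLOWED.items())]

# Hypothetical example row, shaped like the responses above.
raw = ('[{"id":"ytc_example123","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"fear"}]')
print(parse_llm_batch(raw))
```

Dropping out-of-scheme rows (rather than raising) lets a batch succeed partially; a stricter pipeline might instead flag invalid rows for manual re-coding.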