Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This happened in 1979. Robots did not exist in 1979. The internet wasn’t even public until 93. A piece of mechanical equipment such as a press, a steam roller, or furnace, all which could end the life of a human easily are not autonomous. So this man was killed due to a human error when controlling the equipment meaning that to this day there has not been a human killed by a Robot. Also, not a single country out there has hardly even figured out how to make robots that can self stabilize, the technology is so early in its days that it’s hard to consider anything we have out a true Robot. It doesn’t think, it doesn’t formulate, it doesn’t make conscious decisions, it does only as it’s programmed to do. We don’t even hardly understand the human brain so the idea that we have come far enough to have autonomous robots that are making the decision to kill humans on their own is laughable at best.
youtube AI Responsibility 2025-12-21T16:1…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgwFcIhmPBmzyBG3Tud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyNPvsIHMO0cUzw9SB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
 {"id":"ytc_Ugyn3b6hw7hOcXdae8R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwFsRRwIid_EZ3zA2F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyYl1dPUYCHOoXoy3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyxrXRiIfnyfJoYJgZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_Ugz2lHsIl7RWse1adnl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgyFyN3SwIo-PupZvZJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwfJ0tPTF0K4Ny3SsF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgxOqEnBCCIFcPNAnYB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]
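A minimal sketch of how a raw batch response like the one above can be matched back to a single comment id. The function name `coding_for` and the fallback behavior are assumptions, not the pipeline's actual code; but a lookup that defaults every dimension to "unclear" when the id is missing from the batch (or the payload fails to parse) would produce a result row exactly like the all-unclear one shown.

```python
import json

# Truncated example of a raw batch response: one JSON object per coded
# comment, with the same field names as the response above.
raw = '''[
  {"id": "ytc_UgwFcIhmPBmzyBG3Tud4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyNPvsIHMO0cUzw9SB4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "unclear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(comment_id: str, raw_json: str) -> dict:
    """Look up one comment's codes in a raw batch response.

    Falls back to 'unclear' on every dimension when the payload cannot
    be parsed or the id is absent, so the result table always has a
    value for each dimension.
    """
    fallback = {dim: "unclear" for dim in DIMENSIONS}
    try:
        entries = json.loads(raw_json)
    except json.JSONDecodeError:
        return fallback
    for entry in entries:
        if entry.get("id") == comment_id:
            return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}
    return fallback

# Id present in the batch: codes are taken from its entry.
print(coding_for("ytc_UgwFcIhmPBmzyBG3Tud4AaABAg", raw))
# Id not in the batch: every dimension falls back to 'unclear'.
print(coding_for("ytc_missing", raw))
```

Note that the raw response originally ended in `)` rather than `]`; a strict `json.loads` rejects that, which is another way a batch can silently degrade to all-unclear rows.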