Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If people do not take reasonable care, and a person is hurt or killed, the person who did not take that reasonable care can go to jail. That happens all the time. Left an unsecured gun out and a kid gets killed? You go to jail. If Sam Altman and the rest of these people faced the prospect of dying in prison if they build an AI and give it access to the controls necessary to operationalize killing people, they wouldn’t be talking in the cavalier way they are and they wouldn’t be at all tempted to roll the dice with a 10% chance people will be killed. AI should not be given control over systems, it should be able to recommend only and people implement the actions if they make sense. In no situation at all should a computer be given control over the nuclear button or the ability to control the power grid, or deliberately destroy crops, etc. If a company does that, it should be a crime that the people involved go to jail for.
youtube AI Governance 2025-08-30T20:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgwB7V4CYxOmBTfx7sl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzu33lIKGJ07SD1BIF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxhOjRpNDYmovr1TsV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxMs7Rxh1zab6V6Uvx4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzMXinURlHf8LYW0X14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
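A minimal sketch of how a response like the one above can be parsed and matched back to a comment, assuming the model returns a plain JSON array of per-comment codes (the dimension names and the `id` shown here are taken from the response above; the lookup approach itself is an illustration, not the tool's actual implementation):

```python
import json

# Raw LLM response: a JSON array, one code object per comment,
# abbreviated here to the record shown in the Coding Result table.
raw = """
[
  {"id": "ytc_UgxhOjRpNDYmovr1TsV4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "liability",
   "emotion": "outrage"}
]
"""

codes = json.loads(raw)

# Index the codes by comment id so each comment's row can be filled in.
by_id = {c["id"]: c for c in codes}
record = by_id["ytc_UgxhOjRpNDYmovr1TsV4AaABAg"]

# The four coded dimensions for this comment:
print(record["responsibility"], record["reasoning"],
      record["policy"], record["emotion"])
```

Because the model emits one object per comment, a batch of comments can be coded in a single call and re-joined to the source data by `id`; any `id` missing from `by_id` signals a comment the model failed to code.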