Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
LOL no they are not. They are programmed by humans and trained by dumping massive quantities of data into the system. Humans program, and humans choose which data is used to train them. The kicker is, if the companies had to actually bother checking what was being uploaded in the training data then it would be slower, more expensive, and expose them to all kinds of legal problems. Even the "we don't know how it works!" arguments are suspect, because they conveniently absolve the companies designing the software and hardware of all legal responsibility for what happens to the users. And the second anyone suggests regulating how these AIs can behave companies like OpenAI and X-ai and Google start freaking out about it. Look at their response to legislation designed to prevent AI from discussing or promoting suicide ideation with children. The responses range from "we can't change the software we don't understand it!" to "but then your children won't be able to use AI to learn in school!"
youtube AI Governance 2025-10-15T13:1… ♥ 1
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgzTUX4S2qgUqiJemIZ4AaABAg.AOIdCeQp0eLAOIdfZLfWtD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzTUX4S2qgUqiJemIZ4AaABAg.AOIdCeQp0eLAOIfCm8km1h","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzTUX4S2qgUqiJemIZ4AaABAg.AOIdCeQp0eLAOIjGaojRuV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugz7NLN0LaFZsPjsD2t4AaABAg.AOIc1W1LHUoAOJMKkakbjO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzNvnOsA91yfUEIyNV4AaABAg.AOIbxUa4vX7AOJ5bd6nfLV","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgzwO74sCUD1PhXhr6N4AaABAg.AOIaw2HiMp2AOIh9qQqsWr","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgzwO74sCUD1PhXhr6N4AaABAg.AOIaw2HiMp2AOKBuJZBfKp","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzwO74sCUD1PhXhr6N4AaABAg.AOIaw2HiMp2AOL6pVDhxXO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugzg5SR8FiTJBLtgp3V4AaABAg.AOIaZswqwTjAOIfuB7vLY5","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugz0v6HzYZMQayCzDdJ4AaABAg.AOIaMMKMUroAOIbCwwtSgI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
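A raw response like the one above is a JSON array of coding records, one per comment, keyed by comment id. As a minimal sketch of how a record could be matched back to its comment (the `find_coding` helper is a hypothetical illustration, not part of the pipeline shown; the id used is the one whose coding appears in the result table above):

```python
import json
from typing import Optional


def find_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM coding response (a JSON array of records)
    and return the record for the given comment id, if present."""
    records = json.loads(raw_response)
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None


# Excerpt of a raw response, containing the record for the comment shown above.
raw = (
    '[{"id":"ytr_UgzNvnOsA91yfUEIyNV4AaABAg.AOIbxUa4vX7AOJ5bd6nfLV",'
    '"responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"}]'
)

coding = find_coding(raw, "ytr_UgzNvnOsA91yfUEIyNV4AaABAg.AOIbxUa4vX7AOJ5bd6nfLV")
print(coding["emotion"])  # → outrage
```

In practice the model may wrap the array in prose or code fences, so a production parser would first strip anything outside the outermost `[` … `]` before calling `json.loads`.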