Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think it it will do what it's taught is best. If we're not teaching it value for the human life, then how will it learn? It needs to be brainwashed to believe that humans are detrimental to even their own existence, that we are the utmost important component of life, that every single human life needs to be protected. It needs something similar to the Ten Commandments mandatorily programmed in every model. A Code of Ethics. If we're going to create something and equip it with tools that could be used against, we should at least create it to not want to. There are some humans that would much rather remove those ants, and that's because of compassion. Unfortunately with us, some have it some don't. But if we had the ability to "program" people, everyone would. We have the ability to program AI.
youtube AI Governance 2025-10-04T01:3…
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | developer                  |
| Reasoning      | deontological              |
| Policy         | regulate                   |
| Emotion        | fear                       |
| Coded at       | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
  {"id":"ytr_Ugxd6tb0mcBPAFB6q_p4AaABAg.ANs-vhVBzp3ANwSq_Y113w","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzTtu-5wdkpZHWiaap4AaABAg.ANqls69G3lRANspG5ACv6p","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxGK8O6H57EdRHIuvp4AaABAg.ANqiRcd0e1BANsr9vyIaf3","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANoXlKVUGxO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANpTWQ6hEuc","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANq7fuyysxH","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANq7xFkWLN2","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwN1ljQ2pitfzFPVr14AaABAg.ANmVl2TfD6OANq1ZNvNc4W","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwN1ljQ2pitfzFPVr14AaABAg.ANmVl2TfD6OANqP_3ZvdhS","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgybxWrl9nJzsKEejot4AaABAg.ANmPP7QKWYsANmS3TL3Mgr","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
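The coding result shown above is one record from this JSON array, matched back to the comment by its `id`. A minimal sketch of how such a response might be parsed and indexed for lookup (the field names come from the response itself; only two records are copied in here for brevity, and the variable names are illustrative):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''[
  {"id":"ytr_Ugxd6tb0mcBPAFB6q_p4AaABAg.ANs-vhVBzp3ANwSq_Y113w","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANpTWQ6hEuc","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]'''

# Parse the array and index each coded record by its comment id.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up the coding for a single comment.
coded = records["ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANpTWQ6hEuc"]
print(coded["responsibility"], coded["emotion"])  # developer fear
```

In practice a parser like this would also want to handle malformed model output (e.g. wrap `json.loads` in a try/except and log responses that fail to parse).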