Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Is that the lesson and the message? There are many other possibilities in addition to not wanting to teach AI to lie. Another is to worry if AI is trying to think for itself bc it can come to the wrong conclusion, ie to let the astronaut die. It is negligent and murder to let someone die or do something that will kill a human. Machines follow commands without compassion and human’s have laws but the most important law is to LOVE and not cause harm. Only a deranged callous psychopath would want to kill or allow harm.
Source: YouTube, AI Governance, 2025-09-01T04:5…
Coding Result
Dimension      | Value
Responsibility | developer
Reasoning      | deontological
Policy         | liability
Emotion        | outrage
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyEridiT-MacI95HlV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgydhiOUNmoWa8U5jDt4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyzBjWE8EFcMTou_u54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxIe1zo3zPSfTeh3pJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxvEBuxtikF6QJKIsd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy-L-MGQdekYYFsjxB4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz9uIjm_HlHt83xqIV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugw7i924xQVUqYrtPFR4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx1wJAirZe1OEy3APJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugyi9KRWjSaHiXhNZuV4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
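A downstream step presumably parses this raw response into per-comment codes. A minimal sketch in Python, assuming the response is a JSON array whose objects carry exactly the five keys shown above (the key names are taken from the raw output; the function name and the validation logic are illustrative, not part of the tool):

```python
import json

# The five keys observed in every record of the raw response above.
# The allowed code values (developer, deontological, ...) are inferred
# from the sample output, not from a documented schema.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    codes) into a mapping from comment id to its coded dimensions."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            # Fail loudly on malformed records rather than coding silently.
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        coded[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS if k != "id"}
    return coded

# Usage with one record from the response above:
raw = ('[{"id":"ytc_UgyzBjWE8EFcMTou_u54AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
codes = parse_coding_response(raw)
print(codes["ytc_UgyzBjWE8EFcMTou_u54AaABAg"]["emotion"])  # outrage
```

Keying the result by the `ytc_…` comment id makes it straightforward to join each code back to its source comment, as in the "Coding Result" block above.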