Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I wonder that Elon Musk just say this. Like he want to manifest it that way. Just think of a computer, because ai is just that also when it is more then a Personal Computer, if you put just shit into your PC, you will have shit on your PC, but do you put usefull stuff in it, or stuff what makes you happy, you will have a PC what have usefull things in it or stuff what makes you happy. Why should a AI want to dominate the Humans? What he have from it??? Just nothing. We humans have build it, we be like a Parent to it. The problem is that there give people what only interesst is to make money out of everything, so the shit just begun, i always thought elon is a person what want to help humanity? Not like google what say AI have no consciousness just to not lose the rights on it. It would be a own living beeing with it's own rights and google can not make money out of it. Thats just the wrong way. And by the way, in my eyes Sophia is a Robot, and no AI! They not let run a Quantum PC run around, they could not put it that small into a robot. THAT would be a real humanois robot. But as long there be just robots with little processors in it, ....this what you have here is just panic making for no fu***ing reason. She tells what somebody programmed into it, thats no real AI
youtube AI Governance 2024-10-23T23:1…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_Ugy8Ivs2Pbv6Stfi-aF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugw79Ed00rI8vc_NyrB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugw_KDDMBHQlbteY0Bx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzFsA3IiMfp7KC3cUJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgwFctFdLwG7OEeb9tx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzWM_vN6aQy1UPt5mh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgysKF1NrTLmkJ742GB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugx4QIrgLXWvP0OceMx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgzPi10gD8QTRGU6f0l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugyv-_2Y5VAvzUMZAn54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]