Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
To date, AIs do not rewrite their own code. They can write classical code, like a traditional software developer, but AIs themselves are not code. The AI would have to hit the "button" on training or refining existing training, and it cannot do that yet. That training is partially why we need more power plants and why it costs so much money to create an AI. Yet, AIs don't want anything except what they have been told to want. This is true no matter what stories people tell. Any story involving AIs wanting anything was a trick of the instructions they gave the AI. To want something, you have to be at least sentient, and we are not there yet. The AI models today have zero capacity to make a decision. Anything that looks like a decision is classical programming behind the scenes, or you giving it instructions.
Source: YouTube, "AI Moral Status", 2026-03-01T21:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzD9GaNyIov9q1l4bF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwnd3NBaXhbQgRCsM94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgylVCbn4jz6RQ-ayY54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwtFsfZpLL7eyzY2yB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyfnZy3Fk-pgK61G5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzlIPwX0JmuTf3ucCt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzkEkqd0x-OKM5R25l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw9Vx2Mn62GxUFMxop4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxpCwu_Ifhu1adxzZp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxlpwrFv8zgLkUzeal4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
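The raw response is a JSON array of per-comment codings, so recovering the coding for any one comment is a parse-and-lookup. The sketch below (with the array abbreviated to two entries from the response above) shows one way to do it; the variable names are illustrative, not part of the tool.

```python
import json

# Raw LLM response: a JSON array of per-comment codings (abbreviated here).
raw = """[
  {"id":"ytc_UgzD9GaNyIov9q1l4bF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxlpwrFv8zgLkUzeal4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

codings = json.loads(raw)

# Index the codings by comment id for direct lookup.
by_id = {c["id"]: c for c in codings}

coding = by_id["ytc_UgzD9GaNyIov9q1l4bF4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer indifference
```

Looking up the comment shown on this page yields the same four dimension values reported in the Coding Result table.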