Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If AI will do want it needs to accomplish its goal, the answer is simple. Just change the goal to MAKING HUMAN LIFE BETTER. So if it kills all humans, it looses its purpose. So, no human extinction anymore! Hooray!
YouTube · AI Governance · 2023-07-10T04:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwKAkYBY0F_l4T0qN14AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy7hRtFSUZiCt-L-zV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgynQBPXRI3F6nC9HFd4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyjSjZ8pGvFimHW71B4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "skepticism"},
  {"id": "ytc_UgyklT7qxU9FRP4tLPR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw2j9Hghutf9ErjDP54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwQaD1WXK5uKjHW_dx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz3R4brHcH8Cds_gRx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxiyJcdVO6ugRiwKsl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwnaoBizHLu3lja7194AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
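A raw batch response like the one above can be parsed mechanically. The sketch below is a minimal, hypothetical example: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, but the variable names and the choice to index by comment id are illustrative assumptions, not part of any documented pipeline.

```python
import json
from collections import Counter

# A small verbatim subset of the raw LLM response shown above
# (three of the ten coded comments), used here for illustration.
raw = """
[
  {"id": "ytc_UgwKAkYBY0F_l4T0qN14AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy7hRtFSUZiCt-L-zV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxiyJcdVO6ugRiwKsl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
"""

codes = json.loads(raw)

# Index codings by comment id so a single comment's codes can be looked up.
by_id = {row["id"]: row for row in codes}

# Tally one dimension across the batch, e.g. emotion.
emotion_counts = Counter(row["emotion"] for row in codes)

print(by_id["ytc_UgxiyJcdVO6ugRiwKsl4AaABAg"]["policy"])  # regulate
print(dict(emotion_counts))
```

Indexing by id is what lets the tool pair each coding result back to its original comment, as in the "Coding Result" block above.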