Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You are silly, Ezra. If I ask you whether we should appreciate nature, you will tell me all the usual flattering things people say about how nature is important to humanity. But if I ask you whether it is important to maintain technological progress, you will also say all the usual nonsense people say about the importance of keeping up progress. And then I will remind you of all the costs of that progress, beginning with climate change and deforestation. At the end of the day, people do whatever they need to fulfill their short-term needs (which aren't always rational) at the expense of the things they romantically admire. We are ready to sacrifice all the pretty corals in the sea, or all the rare animals in the savanna that we admire romantically, for the sake of progress that is too important to stop. Why wouldn't AI set goals that are too important not to be achieved at any cost? Imagine you are a kid and you watch a cartoon where one of the characters is a pretty chicken. And then it's dinner time and your mother serves baked chicken. You might murmur something powerlessly like 'How can we eat chickens, mom? They are so pretty!' and your mother, struggling to articulate her unconscious stereotypes, will murmur something like 'We can't live without meat. Without meat you will not grow up healthy. Meat is important. We need proteins'. Meat is not healthy, by the way. It's a lot of empty calories and some protein. It's just a habit imposed on humans. Human brains are filled with lots of conflicting, controversial, nonsensical, total BS, and we believe that this is what it means to be human. Alien intelligence will have a much better grasp of reality, I tell you. And there are no guarantees that it will be in favor of our wellbeing.
youtube AI Governance 2025-10-17T10:2…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzmexWnJbzB4UVydcp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugw7OLpNX_TZUxqq59p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwKEehWUnNlPWy_TWd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyU9nMB3UAMNASSNJJ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyP5g2sFlAM953W1SJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx9H0IgRcbmLumw7BZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwMPclbDSD7WueoaUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyS6eg3Ahxh9j_h0xl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugza-ErPaJCR14qaidV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyqpjoKAD_xVT18qRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
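A minimal sketch of how the raw response above can be consumed: the model returns a JSON array of objects keyed by comment id, with the four coding dimensions (responsibility, reasoning, policy, emotion) as string fields. The helper function below is hypothetical (not part of the pipeline shown here); it only assumes the field names visible in the raw response.

```python
import json

# Excerpt of the raw LLM response shown above (one entry, verbatim fields).
raw = '''[
  {"id": "ytc_Ugw7OLpNX_TZUxqq59p4AaABAg",
   "responsibility": "unclear",
   "reasoning": "mixed",
   "policy": "unclear",
   "emotion": "mixed"}
]'''

def coding_for(raw_json, comment_id):
    """Return the coding dict for one comment id, or None if absent."""
    for row in json.loads(raw_json):
        if row.get("id") == comment_id:
            return row
    return None

coding = coding_for(raw, "ytc_Ugw7OLpNX_TZUxqq59p4AaABAg")
print(coding["responsibility"], coding["emotion"])  # unclear mixed
```

A lookup like this is how the per-comment table above ("Responsibility: unclear", "Emotion: mixed", …) can be populated from the batch response.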