Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hi humans, do not forget that smart people also created the most destructive weapons using science. Smart ≠ wise. There are many smart people on this Earth, but few wise ones. Most of what we call smart people worship their egos and status. The most sick and shallow people are in academia and research labs. The environment designed them to be AI agents, competing with each other by reinforcement learning, starting from the nuclear weapons and before it, before the AI era. The most devil combination of a person is being motivated by status, ego , or childhood traumas and insecurities, without any moral motivations. These combinations find a good soil in secular, rich societies. They even invented IQ to feed their egotism. They paid for it in two world wars, and they will pay for it in the next WW, and when AI takes the food from their tables and turns them into slaves for a few big cooperations. Have they learned from history? No, because they are smart.
YouTube · AI Governance · 2025-09-07T04:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       virtue
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugynqxqep33XKT0Drcd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzS04h_C9D5FYQS_0R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx_WTuHBQJiOZtqb0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz5jMhvbr4ssKj7m7R4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxrs_nz3eAkkxKKqEZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw2E0y_3y71Lt0MrOV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx230VedZd87OnEOYN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx-L_KJ5QiKOA4p0Nl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzZZ2a2GwmDRAFv0Nl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx_TYtbtkacXu1jDUF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
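A raw response like the one above should be validated before the codings are trusted. Below is a minimal Python sketch, assuming only the standard `json` module; the allowed label sets are inferred from the values visible in this response and may not cover the full codebook.

```python
import json

# Label sets inferred from the values visible in this raw response
# (assumption: the actual codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "government",
                       "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows

# Two rows copied from the raw response above.
raw = '''[
 {"id":"ytc_Ugynqxqep33XKT0Drcd4AaABAg","responsibility":"none",
  "reasoning":"mixed","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugxrs_nz3eAkkxKKqEZ4AaABAg","responsibility":"developer",
  "reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]'''
rows = parse_codings(raw)
```

Validating at parse time catches the common failure mode where the model invents an off-schema label, so bad rows are flagged instead of silently entering the coded dataset.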