Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Being smart doesnt mean the death of human kind.... i absolutely hate the "terminator" argument. Being more and more smart has diminishing returns - its why even in science fiction we dont know what to do with super smart beings ... and at some level of "intelligence" we just give them super powers like telekinesis or literally turn them into a datacenter and then inject them into the net. Any intelligence still needs to be grounded into reality and society. An insanely smart AI or a human with access to a cell phone connected to that AI doesnt make much of a difference. However, AI can collapse human society by being cheaper than humans - they dont need to be perfect only cheaper once you factor in all costs. The elimination of jobs isnt an AI problem but a societal problem and the pace of AI research will absolutely crush society faster than politicians can pass realistic universal basic income. AI that can perform most white collar jobs isnt like at other points in society where the elimination of 1 job leads to the creation of another when AI can do all non physical jobs. It doesnt even need to get that far, just self driving could eliminate the #1 job in most places in the world and these drivers cannot just go to school to get another job.
youtube · AI Governance · 2024-01-03T20:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugx3nnu__d5YU2RJ5Ct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}, {"id":"ytc_UgzVKMaTyir5bjT6FP54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}, {"id":"ytc_UgxwHwS-c_5L12lJJI94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgzOdJZhUzJTvODTpwl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgzvhbZAiMQPevaB0zN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgybBBA4rcQWJ_9Hah14AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"unclear"}, {"id":"ytc_Ugz5grv0PlYbLG5EFVN4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"industry_self","emotion":"indifference"}, {"id":"ytc_Ugwv1Rq6swsmBy4aICd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgyQtUXcbF0gBKE5tVZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxEdgoXR1c2HCV9Dyd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"} ]