Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@anthonybailey4530 This is fair, and so I always come back to the Truth of the question which is "It either will or it won't". The only other option is forever ban on AI. Before we do that we have to define AI. What is AI or what do we have to give it for it to eventually need, want, accidentally, funnily end human civilization as the AI that is defined in this argument? What goes along with "It either will or it won't" is "What right do you have to decide?" You, on weekly basis, decide that it is OK for mosquitos and stinging wasps to die, but dogs, cats, parrots, and black ants should live. You decide those for the same reason you fear AI. Cuz U selfish and want to flourish. If AI has the capability to do what is argued here, well, it should be respected by you just as you respect black ants. But you don't cuz it is closer to the mangy dog in the alleyway than the Golden Retriever in your neighbor's backyard. AI is the same as any neighboring country. Get along with it or go to War. Those are the only two options you and they have.
YouTube · AI Governance · 2025-10-20T20:4…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          ban
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgxsDYGt5pnHuEcwADB4AaABAg.APR4kvmd2mxAPgOm3fImTp", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_Ugw1TJxC0flXLlbgIId4AaABAg.AOuTNQuEDsZASl8ENH6ZqB", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwQ6h1o4TcPYW_iicB4AaABAg.AOnQLyBsqZIAQVtfEf-1dE", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxWSkgotwHClYZDPgl4AaABAg.AOddGqXVO8mAOgmjGjq4qb", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugwo-F8625VruM-E9sh4AaABAg.AOZOA37uxtPAOgoajP364Q", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_UgwWYp75_48kS6wXu0p4AaABAg.AOXzGoECdQ9AP-HDuD2kXY", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwWhv1m3yX-0Mqq2aJ4AaABAg.AOTPWZPgqonAPN7USpSDcg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_UgxCuyjxFjBVQedL6wB4AaABAg.AOSIS4IPVTGAOWOM5CQnH8", "responsibility": "distributed", "reasoning": "mixed", "policy": "ban", "emotion": "resignation"},
  {"id": "ytr_UgxCuyjxFjBVQedL6wB4AaABAg.AOSIS4IPVTGAOXBVNpJYPd", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytr_Ugz54tRSGTf2WUpK3XB4AaABAg.AORqU822-k0AOTrGwTmXtf", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
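The raw response is a flat JSON array with one object per coded comment, keyed by comment `id`. A minimal Python sketch of parsing such a response and looking up one comment's codes (the two sample records are copied from the response above; the variable names are illustrative, not part of any actual pipeline):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment.
# This excerpt contains two records taken verbatim from the response above.
raw = """
[
  {"id": "ytr_UgxCuyjxFjBVQedL6wB4AaABAg.AOSIS4IPVTGAOWOM5CQnH8",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "ban", "emotion": "resignation"},
  {"id": "ytr_Ugw1TJxC0flXLlbgIId4AaABAg.AOuTNQuEDsZASl8ENH6ZqB",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

records = json.loads(raw)

# Index by comment id so a coded comment can be looked up directly.
by_id = {r["id"]: r for r in records}

codes = by_id["ytr_UgxCuyjxFjBVQedL6wB4AaABAg.AOSIS4IPVTGAOWOM5CQnH8"]
print(codes["policy"], codes["emotion"])  # ban resignation
```

Indexing by `id` is what allows a "Coding Result" table like the one above to be rendered for a single comment out of a batch response.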