Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This interview showcases the generational difference. I honestly feel that the average person’s capacity to reason and articulate rational arguments has been in decline a long time. Not to say stupid peiple havent always been around, but we used to hold people to a much higher level, and the kind of dialogue Shatner is giving now used to be ordinary. It seems extraordinary because we live in an idiocracy. Shatner expresses the simple and self-evident yet profound observation that a machine is only as moral or immoral as the man who programs it. The million dollar question is, whose morality gets programmed? We disagree on so many issues about what is moral—abortions, prostitution, gambling, sex outside of marriage, drug usage, alcohol, clean energy, transgenderism, wars, etc. We’ve reached a point where nobody can even agree on basic truths anymore. So I repeat, whose “morality” will get codified into the a.i.? That seems like it ought to be up to voters and the public to decide, not CEO’s in a corporation, least of all leaders of the tech corps. They’re the last place I would look for a good standard of morality. Already we can see huge bias in the answers these bots are willing and unwilling to answer. Just another means for the elite to control information the rest of us have access to and control what people think, read, and say.
Source: youtube — Viral AI Reaction (2023-07-01T00:0…)
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyGdGXWfz9mGc7WXhR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwecOEJtg_FEv94rkd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzMu8NO8iOZ13ZehyZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy6aa1QSX73lVGpUhF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgygaYR-25ZbP6s-ZXR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxK33a95G0gzZmxVCB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxeWK4wBtCVgLVxtjh4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw4zaTOAG-4PhETrwh4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz0jwdryC9l_oBkbF54AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzgDK_jRdxXvduw6Rl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
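The raw response above is a JSON array of per-comment codings, each carrying an `id` plus the four coded dimensions. A minimal sketch of how the coding for one comment can be pulled out of such a batch — assuming the raw response is available as a JSON string; `lookup_coding` and `raw_response` are illustrative names, not part of the coding pipeline, and the array is truncated to the single entry shown in the Coding Result table:

```python
import json

# One entry from the raw LLM response shown above (array truncated for brevity).
raw_response = """[
  {"id": "ytc_UgygaYR-25ZbP6s-ZXR4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]"""


def lookup_coding(raw, comment_id):
    """Return the coding dict for a given comment id, or None if absent."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None


coding = lookup_coding(raw_response, "ytc_UgygaYR-25ZbP6s-ZXR4AaABAg")
print(coding["emotion"])  # resignation, matching the Coding Result table
```

Looking codings up by `id` rather than by list position keeps the inspection robust if the model returns entries in a different order than the comments were submitted.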