Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
uhm...idk...to me the argument against A.I. is more of a moral one, than anything else, and people keep trying to frame it as an technical one, just look at the USA right now, the current president won the election basicaly because there is a large % of the society that feel they no longer have a place on it, all that because in the last....what? 3 decade we see the exponential replacement of the work force for automation reaching speeds that are simple not representative to how we have build our society, meaning, how u gonna "prepare" someone to the future, when 1/2 way into a 5y college curse, everything change, the idea of "everyone" will be a programmer or something like this is simple ludicrous, no, not everyone has a "tech" bone, in fact, most people don't. what will happen to those people? and that is the where the "moral" part cames in, NONE of the A.I. "boys" dare to answer that question honestly in public....Frankly we need to slow down, but we can't, because lets be honest, the capitalist endevour for all good it has done, it also causes a little bit of problem, it needs to consume, so we basicaly in a race, will we advance technologically fast before we run out of resource?.... OR! maybe i'm wrong...da fuck i know?
youtube AI Moral Status 2025-07-24T18:3…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        deontological
Policy           unclear
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxUEM-tNXw2-IUmfYB4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxTkRbrDn0nDozIV2d4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgyLZ2xtDqbxbmHAxKt4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgxRxWzmWLQdI-GZkUZ4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",     "emotion": "approval"},
  {"id": "ytc_Ugy7kfe6c_10uDbMFMd4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgyI2hAOxuOgtEdDP6d4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgzpUQnzunT7fwNnphB4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgxWeuTXgkWqywJNVJZ4AaABAg", "responsibility": "distributed", "reasoning": "deontological",    "policy": "unclear",  "emotion": "outrage"},
  {"id": "ytc_UgxCPSoh3LipNk7QAet4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyX3j8sy5vFV7DSOyN4AaABAg", "responsibility": "government",  "reasoning": "virtue",           "policy": "regulate", "emotion": "outrage"}
]
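A raw response like the one above can be parsed and validated before the codings are stored. The sketch below is a minimal example, not the tool's actual pipeline: it assumes the JSON shape shown here, and the allowed category sets are inferred only from the values appearing in this document, not from the full codebook.

```python
import json

# Allowed values per dimension, inferred from this page's table and JSON
# (an assumption; the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed",
                  "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse the model's JSON array and index codings by comment id."""
    records = json.loads(raw)
    out = {}
    for rec in records:
        coding = {dim: rec[dim] for dim in ALLOWED}
        # Reject any value the model invented outside the codebook.
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        out[rec["id"]] = coding
    return out

# One record from the response above, used as a small worked example.
raw = ('[{"id":"ytc_UgxWeuTXgkWqywJNVJZ4AaABAg",'
       '"responsibility":"distributed","reasoning":"deontological",'
       '"policy":"unclear","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings["ytc_UgxWeuTXgkWqywJNVJZ4AaABAg"]["emotion"])  # outrage
```

Validating against a closed category set catches the common failure mode where the model returns a paraphrase (e.g. "anger" instead of "outrage") that would silently corrupt downstream counts.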