Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
And yet they want to up the birth rate? I’m convinced the goal is harvest us for assets/resources and “put us out to pasture”. Penny wise pound stupid. What’s the long game? If you have all the money, what good is it if there’s no one to provide goods and services? Although, the Ai robots will create and provide the goods and services. But there will come a point where the money is GONE. If we the masses can’t purchase whatever the uber wealthy sell…where will they get more money and resources? What’s the goal? I’m missing bits. Is the goal to have a few hundred thousand people left on earth who get to own the mountains and seas for themselves? And these people…why do they think they’ll be safe? Who will do their surgery? A robot I suppose? Build their home? A robot, sure. A series of automated machines could create the supplies and harvest raw materials and we’ve become superfluous.
youtube · Cross-Cultural · 2025-10-01T04:3…
Coding Result
Dimension        Value
---------        -----
Responsibility   distributed
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyrC0rJTKIkil5mYvV4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgxXQ4ZNF2nDN5BVbRV4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_UgyT2fSrlC6FOY5Abk94AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugy3SsSahV5Cg8_aAid4AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugw7cfzwStzsUgh6I_t4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgyajQkC_SnljuNnTpp4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgxwoDmEnuRdJ9UOzhJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugwvl9aiwnP6vH1SOQd4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxXbemJceVXasfzAGN4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgxRNgQdUpZq39vyDWJ4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",      "emotion": "approval"}
]
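The raw response is a JSON array of per-comment codings, keyed by comment `id`. A minimal sketch of how one might parse it and look up the coding for a single comment (the function name `coding_for` and the abridged two-element sample are illustrative, not part of the actual pipeline):

```python
import json

# Abridged sample of the raw LLM output: a JSON array of codings,
# one object per comment, keyed by "id".
raw_response = """
[
  {"id": "ytc_UgyrC0rJTKIkil5mYvV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxwoDmEnuRdJ9UOzhJ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def coding_for(raw: str, comment_id: str):
    """Parse the raw model output and return the coding dict for one comment."""
    codings = json.loads(raw)  # raises ValueError if the model emitted malformed JSON
    return next((c for c in codings if c["id"] == comment_id), None)

result = coding_for(raw_response, "ytc_UgxwoDmEnuRdJ9UOzhJ4AaABAg")
print(result["responsibility"])  # distributed
```

In practice the parse step should be wrapped in error handling, since LLM output is not guaranteed to be valid JSON; a `None` return flags an id the model skipped.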