Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here's what I see coming. By 2100, automation run amok makes the vast majority of human labor obsolete. Mass employment will become the new normal. Those who control the machines will become the richest most powerful elites ever. If they feel generous, maybe they'll throw some crumbs down to us in a meager UBI. If not, many of us will be driven into revolution and criminality, which will be crushed under the iron fists of killer robots and jackboots of fascists. Most elites will literally wall themselves off from the rest of us rabble and not care if we die because we no longer have any economic value to them. We're ruled by silver spoon-fed sociopaths, so don't expect them to grow a conscience. And maybe more people escape into increasingly immersive virtual worlds because their real world sucks ass. That's not even the worst-case scenario. That's what's most likely to happen. Worst case scenario, we nuke ourselves back to the stone age in a political dick-measuring contest. And I wouldn't rule it out, not with people like Trump and Putin heading countries with the biggest nuclear weapon stockpiles.
youtube · Cross-Cultural · 2025-10-14T08:2…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugy79KU0MMVj7g4HZsR4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzzGa6-7HEutJ_Ehc54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzN1JpelHUJ-xWj7GF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugyijgsrq9uOktByf-l4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyYJiPxIiByUlWZ5Z94AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugy_zBvZZr20-U7aOrp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzfzdK2rtX8lbx1t2Z4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwbwk4IfbilIoW9JEd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzXATGc2_D5c-UHOzF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw2-8v2TI0VvlJeg4V4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
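Inspecting a raw response like the one above amounts to parsing the JSON array and looking up the object whose `id` matches the comment of interest. Below is a minimal Python sketch of that step; the `ALLOWED` value sets are inferred only from the codes that appear in this response, and the real codebook may contain additional categories.

```python
import json

# Allowed codes per dimension, inferred from the raw response above;
# this is an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"distributed", "company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear", "none"},
    "emotion": {"approval", "outrage", "fear", "indifference"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects) into a
    mapping from comment id to its coded dimensions, rejecting any value
    outside the inferred code sets."""
    out = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        coding = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} code {value!r}")
        out[cid] = coding
    return out

# Look up the coding for the comment shown above (id from the raw response).
raw = ('[{"id":"ytc_Ugyijgsrq9uOktByf-l4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugyijgsrq9uOktByf-l4AaABAg"]["emotion"])  # fear
```

Validating against the code sets at parse time catches the common failure mode where the model invents an off-schema label, so it surfaces as an error rather than a silently miscoded row.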