Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Turns out Ted Kacznyski may have been right. See his manifesto: Industrial Society and its Future. He thought a return to nature was the only salvation, and only possible after a revolution when industrial society became unstable. Are we now approaching that point? Of course Kacznyski was known as the Unabomber because of his violence against prominent educators in the field of computer development way back in the late 70s up to 1996, when he was finally arrested. By all standards what he did was unacceptable. Plus he failed. And here we are. See Sam Altman's observation that AI would kill us all. "But along the way there would be some great companies." Elon Musk pretended to care at one point. Go down the list. Mark Zuckerberg is now claiming he has ASI in the bag, but won't release it to the public because it's too dangerous. Aside from the fact that Zuckerberg is probably lying, if he did have it, he would dangle it in front of Donald Trump. Our only hope is that there is a massive economic collapse and the whole dark enterprise is stalled indefinitely .... #nochance
youtube AI Governance 2025-09-05T11:0…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          ban
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxN6hKiSB69pWxQj554AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy-d47XuiYOFLABilB4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwTWBsoEfsVr7EjvvZ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwp-OQArrVuw-kKI3h4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxBeBnmWlnXWYkVxph4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz9q_IcR8noI6CG9QR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzgB699xfuV0XNDvqp4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx_XKXrBXGaqdeV6Th4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzN0KAgCsF_D1hMAxx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugza8YILH8C0QEwXpD54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
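When inspecting raw output like the response above, it helps to parse it and check every record against the codebook before trusting the batch. Below is a minimal sketch of such a check; the allowed value sets are an assumption inferred from the labels visible in this sample, not a published codebook, and the function name `parse_raw_response` is hypothetical.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from the
# labels that appear in this report's sample output, not from an official codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "government", "company", "developer", "distributed", "none"},
    "reasoning": {"mixed", "deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    all fall inside the allowed code sets."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: the record that produced the coding result shown above.
raw = ('[{"id":"ytc_UgzgB699xfuV0XNDvqp4AaABAg","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"ban","emotion":"fear"}]')
for rec in parse_raw_response(raw):
    print(rec["id"], rec["policy"])  # prints the id followed by "ban"
```

A record with an off-codebook value (for example a misspelled emotion) is silently dropped here; in practice you would likely log it instead so the comment can be re-coded.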