Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
UBI could solve all of this, though I'm not on board personally, with AI or UBI. It's not our economic structure that's the issue, it's our leadership. We don't have the right people to manage all these changes. For all our intelligence, thinkers and wonders, humanity seemingly does not have the ability to conjure up the right leaders at the rights times, at least as much as we'd like, at least recently... Humanity has no leaders worth following right now, not for the scale and speed of changes about to hit our planet. Can you think of one who can oversee all this coming change? Some will appear doubtless... eventually, but in the interim how many hits will our species take? With the speed things are moving, will we make it to a point where we can still maneuver? AI is moving a lot faster than we move it seems. I'm extremely skeptical of AI, due common sense and historical knowledge, but it's more than just intellectual conversations and far off hypotheticals about AI, we're talking about existential threats. We almost annihilated the earth in 1961, and that was a minute ago anthropologically-speaking. We still haven't solved the problem of nuclear annihilation despite what people would say. So one given variable on the existential threat axis. Do we really want to add another? Do we really want to play with a technology we can't be trusted to handle the implications of? We are still very young and dumb as a species... and I consider myself an optimist.
youtube AI Governance 2023-12-31T08:3… ♥ 3
Coding Result
Dimension       Value
Responsibility  government
Reasoning       virtue
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugyjmc0I8pxKjr4gIMF4AaABAg.9yyLpopwLyL9z6XLJGeQwA", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgzDyfwk9HibtxrCzIx4AaABAg.9yyKgfxAuPT9yyc_r5jK5O", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzDyfwk9HibtxrCzIx4AaABAg.9yyKgfxAuPT9yzCoJi-gQo", "responsibility": "government", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzDyfwk9HibtxrCzIx4AaABAg.9yyKgfxAuPT9yzez4q6hSN", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugw-vK8R4T_XfigaeTp4AaABAg.9yyH3DBWJXx9yyXg3Z54PC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugw-vK8R4T_XfigaeTp4AaABAg.9yyH3DBWJXx9yzz351Gn2W", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgwfwbP9YkYtoEX0FV14AaABAg.9yyFxuGKc3H9yyacZtKkS6", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_UgwfwbP9YkYtoEX0FV14AaABAg.9yyFxuGKc3H9yz2DQDBtdB", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgweuiImBq4PQVwaXXN4AaABAg.9yyDV-mqjeN9yyw2YSB8pn", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgweuiImBq4PQVwaXXN4AaABAg.9yyDV-mqjeN9z1S_gSC9fi", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"}
]
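The raw response is a JSON array of per-comment records, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion) keyed to a comment id. A minimal sketch of how such a batch response could be parsed and matched back to a comment — the `index_codes` helper and its validation rule are illustrative assumptions, not part of the coding tool:

```python
import json

# Illustrative record, copied from the raw response above.
raw = '''[
  {"id": "ytr_UgzDyfwk9HibtxrCzIx4AaABAg.9yyKgfxAuPT9yzCoJi-gQo",
   "responsibility": "government", "reasoning": "virtue",
   "policy": "none", "emotion": "resignation"}
]'''

# The four dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Parse a batch response and index records by comment id,
    skipping any record that is missing one of the four dimensions."""
    records = json.loads(raw_json)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codes = index_codes(raw)
print(codes["ytr_UgzDyfwk9HibtxrCzIx4AaABAg.9yyKgfxAuPT9yzCoJi-gQo"]["emotion"])
# resignation
```

Indexing by id lets each coded record be joined back to its source comment, which is how the "Coding Result" table above pairs with the comment it describes.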