Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples:

- ytc_Ugz1Xzid4… — "I think that self preservation is one of the fundamental things needed before sa…"
- ytc_UgzMfK2dx… — "Next thing you know they’ll make AI soldiers because nobody wants to die. No one…"
- ytc_Ugw83CSHu… — "How the f is it legal to use ai in the medical field when a known problem in ai …"
- ytr_Ugw24WZzQ… — "@mikegrindstaff I'm referring specifically to superintelligent AI. A superintell…"
- ytc_UgxiSBXjU… — "Look back to Lars Ulrich’s (Metallica) arguments in Congress about Napster. Is A…"
- ytc_UgxgZa4Bx… — "Honestly, I think the way things are heading, the only way we’ll truly keep pace…"
- ytc_UgxioJ6OW… — "29:30 the thing that's not being considered here is if AI kills all the people, …"
- ytc_UgxfYDeDa… — "AI actually means Alien Technology. It's entire purpose is to destroy the human …"
Comment
Universal basic income will have to be designed creatively in each country. The world is reaching a point where automation and AI will replace too many jobs for the old system to keep working. If governments fail to adapt, people simply will not be able to survive. It is no longer about politics or ideology. It is about economics. When machines outproduce humans, we must rethink how people live and how society sustains itself.
youtube · Cross-Cultural · 2025-10-07T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwyWjrupENz0if0az54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwCnQF-NZz0-1Y28S54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyvUHJVGyXXOqUQahZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyOQW-sbI2OhrV3YEl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwXYvAZrk72ouGrmcp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy3ZokfA4489qyFF-14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxOMRA7nMqjlpi_Hf14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugzsrr8Z6q3rQWf_Bop4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz3pEWvnKw7aX4PvH94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx9B4zxaJvqycmeffR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
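The raw response above is a JSON array with one object per coded comment. A minimal sketch of how such a batch response might be parsed and looked up by comment ID — the dimension names are taken from the table above, but treating missing dimensions as `"unclear"` is an assumption, not necessarily what this tool does:

```python
import json

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response into {comment_id: codes}."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Keep only the expected dimensions; default missing ones to
        # "unclear" (assumption: mirrors the codebook's fallback value).
        coded[rec["id"]] = {d: rec.get(d, "unclear") for d in DIMENSIONS}
    return coded

# One record from the response above, as a usage example.
raw = '''[
  {"id": "ytc_UgwXYvAZrk72ouGrmcp4AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "approval"}
]'''

coded = parse_batch(raw)
print(coded["ytc_UgwXYvAZrk72ouGrmcp4AaABAg"]["policy"])  # regulate
```

Indexing by `id` makes the "look up by comment ID" view a single dictionary access once the batch is parsed.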