Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Generative AI uses statistical models and like every statistic the results conta… (`ytc_UgzloCNEc…`)
- Risk factors cause brain damage. How bad it is determines what doctors call it. … (`ytc_Ugw4W_Z0l…`)
- Pretty sure this is inteded to be inflammatory. They're capitalizing on the rage… (`ytc_UgyWKJoya…`)
- Why do we need Everything automated ???? EASY Robots & Machines "#1 Don't… (`ytc_Ugzj-mO08…`)
- Dude, are you dumber than ChatGPT? Why are you treating it like an intelligent a… (`ytc_UgywaBVI4…`)
- The HAL thing has to do with memory damage the mars maintenance crew did by brow… (`ytc_UgwZMFDoW…`)
- The most significant point about them is completely missed. People die. Musk, Pu… (`ytc_UgxTvOlcb…`)
- Well now im thinking about AI drones with thermals hunting the last of us down i… (`ytc_UgzkEkqd0…`)
Comment
The risk with UBI as drafted today is that it would serve big capital, which already shapes government policy. Non-compliance with state or corporate interests could mean losing your livelihood, or being forced to work without traditional labour rights, effectively trading labour for UBI. Work and jobs provide autonomy, and despite political rhetoric, we are still far from true AI dominance.
Source: youtube · Posted: 2026-01-10T15:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw3ZmOVOyoYr_grxzt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwR-tzpHABBbsLfaph4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyKUNBZWaJ6aEd-HId4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxMWdccyZFOZ69Xup94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgztkV0wS5juP-Y_KGV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzLXwyJt1nHPjtW-XV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzBQ6t9aEx9-y077lJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy62YyyizS4Cadl9jh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwL-oqiLQGdrs49wGV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzlWENyCB7jJ8Y9RZl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"fear"}
]
```
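The "Look up by comment ID" workflow above can be sketched as a small parser: the raw LLM response is a JSON array of records, each carrying the four coding dimensions keyed to a comment ID. Below is a minimal sketch, assuming the response has exactly that shape; `index_by_comment_id` is a hypothetical helper name, not part of the tool itself, and the two records are copied from the dump above.

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw = '''[
  {"id":"ytc_Ugw3ZmOVOyoYr_grxzt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzlWENyCB7jJ8Y9RZl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"fear"}
]'''

# The four coding dimensions plus the comment ID, as seen in the dump.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_comment_id(raw_json: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw_json)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw)
print(codes["ytc_UgzlWENyCB7jJ8Y9RZl4AaABAg"]["policy"])  # prints: liability
```

A lookup table like this is what makes the per-comment inspection view possible: given a comment ID, the coded dimensions are retrieved in constant time rather than by rescanning the raw response.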