Raw LLM Responses
Inspect the exact model output behind any coded comment, either by looking it up by its comment ID or by picking one of the random samples below.
Random samples:

- "> I can't fathom a reality in which they aren't aware of the issues, but they…" (`rdc_da411oq`)
- "My browser— Brave— suddenly featured AI one day, a new item on the menu bar, and…" (`ytc_UgwxL7kt1…`)
- "That's the point ai bros miss: it was always possible, but in our era is as easy…" (`ytr_UgysFuPc1…`)
- "The main thing you arnt taking into account is healthcare is incredibly elastic.…" (`ytc_UgxFGyumZ…`)
- "We need UBI as an AI Dividend — a way to repay people whose data trained AI and…" (`ytc_UgzOogORR…`)
- "So i'm all for having regulations with AI. But as far as this exact example, It…" (`ytc_UgxrxsN8O…`)
- "@Mantaforce2 I think the difference lies in art being in a muddy place between pe…" (`ytr_UgzelX1Is…`)
- "well, im not shocked or even surprised, the ones developing the AI mostly want t…" (`ytc_UgygvQ3qf…`)
Comment
> Keep in mind, there is no skill/ability, in principle, that a human can do that an AGI cannot. AI may cure all diseases, aging, ect but there is no guarantee whatsoever that the elites will share or afford those amazing developments with the then economically superfluous everyday human. As things currently stand regulatorily and legislatively, we're headed to mass unemployablement and a late stage capatlalism dystopia. Many elite and elite adjacent voices are already emboldened enough to express their disdain for democracy and political imput from the general public/working class.
Source: youtube · Posted: 2024-03-31T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzEGMFXZhLNaD1-4BN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxAtsKNaD2s6fwaSPF4AaABAg","responsibility":"elites","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz_SK84NbH3S8eA6GN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwux3WNOMZ69bhSrFx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxlNvjGB6xA0h0rE754AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxZX-ZrM6bVhHqvbCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx3ss3DWn4cPO5RSHN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwRXW3wm65cpzlFTZ54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0FXQAgzsxgq8Yz2B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgykeaE21rwGWZSqIuh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
```
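A raw batch response like the one above can be loaded and indexed by comment ID in a few lines. This is a minimal sketch, not part of any pipeline shown here: the field names come from the JSON above, the sample row is copied from it, and all variable names are illustrative.

```python
import json

# Minimal sketch: index a raw batch response (a JSON array of codings)
# by comment ID so a single comment's coding can be looked up.
# The row below is copied verbatim from the response above.
raw_response = """[
  {"id": "ytc_UgxAtsKNaD2s6fwaSPF4AaABAg",
   "responsibility": "elites", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

# Build an id -> coding dictionary from the parsed array.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coded comment by its ID.
coding = codings["ytc_UgxAtsKNaD2s6fwaSPF4AaABAg"]
print(coding["policy"], coding["emotion"])  # prints: regulate fear
```

Keying the parsed rows by `id` mirrors what the "Look up by comment ID" view does: each dimension in the Coding Result table is just one field of the matching JSON object.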