Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Instead of ai taking jobs why not use robots to help people like my mom who is b…" (ytc_UgxQ1O1Xq…)
- "How can we use AI to get more pubs? Any free software that can be recommended?…" (ytc_Ugzh_UQXW…)
- "It’s going to be better than ever with the ai revolution don’t let them fool you…" (ytc_UgxV8gfEl…)
- "That's why you always want to know how your tool is working; unfortunately, it i…" (ytc_Ugwj8w4fR…)
- "it's unacceptable how many ads I had to go through to watch this full video.…" (ytc_Ugxbi4LpO…)
- "If you look at everything that happened before the public had access to AI it’s …" (ytc_UgxON08kL…)
- "Educate a fool (me): Does an autonomously driving AI understand *concepts*, or d…" (ytc_UgxwLZQhh…)
- "As do I, if we're burning that much coal I think it wrong to let it go to waste.…" (rdc_da40n4d)
Comment
Who is most likely to “survive” the AI takeover?
Who will AI allow to live on?
I shouldn’t think there’s any reason to keep “Technical Humans” (anyone with an interest in technology or technical development). These people have reached the end of the road, superseded, desperate and a danger to themselves and the planet. That’s most of us gone.
Perhaps there are a few non-technical tribes out there that have no interest in developing and live purely natural lives, pure animal. AI may keep a ‘guardian eye’ over them, but if they show interest in fire, flints and wheels they could be in trouble.
youtube · Cross-Cultural · 2025-09-28T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwFDNe-Hxy78XRU3kR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw3OPrB1ZJ0slvPBN54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxvnb5yz6n4zQgfBV94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzwC8hc_uoGZk3534x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz04vpRSftOr6Iupl14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxhPHuC4Qh8OYervld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyoICrIfVSyTQ-h1-94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0qXpaXSwyPI2cR014AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyCByvY2b77XmMiy9J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxqKRk8sweXtYTCPKJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
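The raw response is a JSON array with one object per comment, carrying the four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`) keyed by comment `id`. As a minimal sketch, such a batch might be parsed and lightly validated like this before it is stored; the `parse_batch` helper and `REQUIRED` set are illustrative, not part of the tool itself:

```python
import json

# Example batch response, using two rows from the log above.
raw = '''[
 {"id":"ytc_UgwFDNe-Hxy78XRU3kR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxhPHuC4Qh8OYervld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]'''

# Fields every coded row must carry (per the response format above).
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(text):
    """Parse one raw LLM response into a {comment_id: codes} map,
    dropping any row that is missing a coding dimension."""
    rows = json.loads(text)
    out = {}
    for row in rows:
        if REQUIRED <= row.keys():
            out[row["id"]] = {k: row[k] for k in REQUIRED - {"id"}}
    return out

coded = parse_batch(raw)
print(coded["ytc_UgxhPHuC4Qh8OYervld4AaABAg"]["emotion"])  # resignation
```

Keying by `id` makes the later "Look up by comment ID" view a single dictionary access, and silently dropping incomplete rows is one possible policy; a production pipeline might instead flag them for re-coding.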