Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "You may be able to build a machine more intelligent than a HUMAN...!!!... BUT...…" (ytc_Ugx454tiB…)
- "We all need to be optimistic, and willing to roll with any changes that happen. …" (ytc_Ugx41Wa78…)
- "first it looks pretty fake that an AI can cry. What model should this be? Second…" (ytc_Ugw1phwi4…)
- "The AI that I'M using? Bruh I don't use that shit and I never will.…" (ytc_UgyfKcHsA…)
- "This is 100% pecent the parents' fault, just like almost everything else is. Sto…" (ytc_UgxMXbMzv…)
- "I think it democratizes art, it's no longer just in the hands of a skilled few. …" (ytc_UgwwdCpKS…)
- "Sooo this Means all the ugly men can now have fine ass girlfriends!? Wooow! 😄😁😆…" (ytc_UgzWHp-V4…)
- "Except Scrooge McDuck understood the value in increasing in people. AI and those…" (ytc_Ugz6B0g1s…)
Comment
Nobody ever answers the question because it never seems to be asked - "with what money?" Healthcare will be cheaper, but people will be unemployed and without the money to afford it. Goods and services ditto. With. What. Money? What will be the point of serving and maintaining all of those useless people?
The premise seems to be that AI will replace most workers, so there will have to be UBI. Hmmm, so we're creating a technology that makes people largely obsolete, but we still have to pay them as though they were working. Seems a bit like Wile E Coyote's logic in his plan for a fan powered sailboat.
If all of this comes to pass, the inescapable decision will be that most people aren't worth keeping. How that's manifested is anyone's guess. To be sure, our tech overlords will conclude that they and their ilk are worth keeping.
youtube · Cross-Cultural · 2026-02-13T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
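A coding result like the one above can be sanity-checked against the label vocabulary before it is stored. The allowed value sets below are a sketch inferred from the responses displayed in this panel, not an authoritative codebook:

```python
# Validate one coding result against the per-dimension value sets observed in
# this panel. NOTE: these allowed sets are inferred from the displayed
# responses, not an official schema.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "disapproval", "approval",
                "indifference", "resignation", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return the dimension names whose value is missing or out of vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

result = {"responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "fear"}
print(validate(result))  # -> [] (all four dimensions are in vocabulary)
```

An empty list means the coding uses only known labels; any names returned point at dimensions the model answered with an unexpected value.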
Raw LLM Response
```json
[
{"id":"ytc_UgzA9WjAKUVK9-VCflB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyfuEXjxqQbTTNG7l54AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyXdM-Et8rL1skb5Fl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyy0eorV09Y03BTzPJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxG9aQVUI6hA5M3zON4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgzVXR7HZ7cWjy2RukZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyCSM4RmpGLHXtgfJF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxT8HjGd3yGjvyCpxF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwLorXKW9OBBtKnQRl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxu6PMk6y317Klv2QZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```