Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Really interesting interview, thank you both! Really interested in this idea of …" (ytc_Ugxtz0chj…)
- "thats pretty damn racist, south korea.... maybe they should open their eyes a b…" (rdc_clvobns)
- "Well AI are buult by people soooo 🤷♂️ qnd it has access to whats going on in th…" (ytc_UgxEi8iVv…)
- "He gives it away at 6:49. This isn’t a debate about whether a particular AI is s…" (ytc_UgzbG0CuB…)
- "Alex O' Connor interviews ChatGPT somehow is charming while Alex Jones interview…" (ytc_UgzDMKPjI…)
- "That's an interesting comparison! Sophia does have a unique look that can remind…" (ytr_UgxbarliD…)
- "Ingroups always dismiss outgroup conflict. I doubt she's likely to be targeted b…" (ytr_UgxWbU_JC…)
- "Yeah for now. The moment it gets smarter why think for yourself? We are buil…" (ytc_Ugys7ylHk…)
Comment

> AI replacing humans is fine, the problem is not being able to create an UBI ,.,
> In theory, you could make farms for resources like food, water, and AIs to build shelter, and this UBI-2.0 system would automatically distribute resources for humans as they are needed, because well, these farms only need to run on electricity to produce food for humans, humans only need food, water and shelter at the minimum, so you could keep the population alright by giving them all of this but no money, in fact with UBI-2.0 the regular population doesn't need any money as long as they get the resources they need to survive, and yea we all want commodities and luxuries like clothing and such....

Platform: youtube · Video: Viral AI Reaction · Posted: 2025-11-22T21:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwVj0yBaCsWshzkzOp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxrfIEr12XagQLno9d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyugDQQQHInXD2_k5B4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwQYJbetM-F-Ptrhmt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzGTAT7f4qY1DY3aj94AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgybD4_4YnoGm7d5upZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyaOjs2sqjzkdl3Se54AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugxg69DfuxFIvt3HRZR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwmcM92dDg-os39wDV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyPEXKkYRAmsL14Hfl4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
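Since the raw response is a JSON array of per-comment records, finding the coded dimensions for a given comment amounts to parsing the array and matching on `id`. A minimal sketch, assuming only the record shape shown above (the function name and `RAW_RESPONSE` variable are illustrative, not part of the pipeline; the two records are taken from the batch above):

```python
import json

# Two records copied from the raw response above, for illustration.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyaOjs2sqjzkdl3Se54AaABAg",
   "responsibility": "government", "reasoning": "contractualist",
   "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgybD4_4YnoGm7d5upZ4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

record = lookup_coding(RAW_RESPONSE, "ytc_UgyaOjs2sqjzkdl3Se54AaABAg")
print(record["responsibility"], record["policy"])  # government liability
```

The fields of the returned record map one-to-one onto the rows of the Coding Result table (Responsibility, Reasoning, Policy, Emotion); a missing ID yields `None` rather than an exception, so a caller can distinguish "not coded in this batch" from a malformed response.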