Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Why do people create robot and AI? because we need labour to do intensive and dangerous things that would cost billions if done by human. Human need salary and other benefits. If we give rights to AI, then their existence doesn't resolve anything. They would demand equals compensation as human. The cycle would repeat itself, until human get wiped out of civilization. Pathetic isn't it? losing to our own creation.
Platform: youtube
Video: AI Moral Status
Posted: 2020-07-08T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxETV1MDiKHFMDczol4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwyql58qzAdMjrxuTd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwT2ROP1DdiiDKGB6l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx10SywtLayObwrqHJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxskrXQC6_fIBlsuDR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxpi_ocEZKOU8fp3oF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxeEqy56N4G3NSRKx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy7aG3aZkGEfL0ueo14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwhabjBBTmtZ8xk_bR4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxnqib2a0o5Oe0zda94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
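The raw response above is a JSON array mapping each comment ID to the four coded dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and indexed by comment ID (the function name `index_codings` and the validation logic are illustrative assumptions, not part of the actual pipeline; the sample row is copied from the response above):

```python
import json

# Dimensions from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# One row copied verbatim from the raw LLM response shown above.
RAW_RESPONSE = """[
  {"id":"ytc_Ugx10SywtLayObwrqHJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and map comment ID -> coded dimensions.

    Hypothetical helper: raises ValueError if a row is missing a dimension,
    which is one plausible way to catch malformed model output.
    """
    out = {}
    for row in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in row]
        if missing:
            raise ValueError(f"{row.get('id')}: missing {missing}")
        out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugx10SywtLayObwrqHJ4AaABAg"]["emotion"])  # resignation
```

Looking up a coded comment then reduces to a dictionary access on its ID, matching the "Look up by comment ID" affordance of the page.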