Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The most scariest reality that mankind created AI will surely get us all in the …" (ytc_Ugw6_XpIf…)
- "the United States of America wants to win China by decoupling a better and smart…" (ytc_Ugw9L-p5Q…)
- "7:10 Whoa there my guy, did you just say you can accurately distinguish between …" (ytc_Ugx5xFKsP…)
- "Same, I like art made from humans because I know what goes into making art. I al…" (ytr_UgxAYEmUR…)
- "I agree ai should be stopped now. We don't need it. It makes us lazy. It's to…" (ytc_Ugz4xGXFs…)
- "So I work for healthcare services, we are to identify AI calls and end them. Thi…" (ytc_UgxFkudQn…)
- "AI art only looks cool, if you zoom in.. it's just a bunch of mess.…" (ytc_UgznU1dou…)
- "Intelligence is already there. Physics is following intelligence without any dev…" (ytc_UgxCG4cyK…)
Comment
We’re asking the wrong question about AI.
It’s not: “What happens when AI takes our jobs?”
It’s: “Why did we ever confuse jobs with human worth?” That wasn’t truth, it was conditioning.
The uncomfortable part:
AI won’t remove purpose.
It will expose how few people ever had to define it themselves.
Most people didn’t lose meaning,
they outsourced it. So when the structure disappears, they don’t know who they are without it.
That’s not a tech problem.
That’s a human one.
youtube · Cross-Cultural · 2026-03-22T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwyIK6jOoioSzTWkKt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMq0GK5SKr5_FBIVF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgykWs7dvUCP9ekheGF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKH-XdvdBts5t5_Qh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyWsbCnnkOidK_XyTx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgmNTP8_rRyAUfeu54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzggtr1JGVR4kLPv-54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxy58XTNqqIhxpc5Bp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxLzoQuAOqamm6KQiR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzO9Tc6pH87URO19Lx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
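The raw response above is a JSON array of per-comment codes, one object per comment with the four dimensions shown in the coding-result table. A minimal sketch of how such a response might be parsed and validated before it is stored — the field names come from the sample output, but the allowed category sets are assumptions inferred from the values visible here, not the full codebook:

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage",
                "resignation", "mixed"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows that
    have an id and a valid value for every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping (or flagging) rows with out-of-vocabulary labels, rather than storing them, keeps a single malformed model response from contaminating the coded dataset.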