Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "4:54 THIS. I cannot get the thought out of my head that all of this AI bullshit …" — `ytc_UgwrpRxPZ…`
- "AI is changing the world but if we lose access to internet and go to old era of …" — `ytc_UgyUYCaKw…`
- "He's jealous users don't love anymore giving their data away to the US in exchan…" — `rdc_m9gf5ov`
- "If AI fks up your computer by generating faulty copy and paste code, who do you …" — `ytc_UgykeLwZv…`
- "Humans: hands are hard to draw when you are a begginer but groeing up improving …" — `ytc_UgzIIx-CS…`
- "So, you really need to talk to an engineer about AI. It is not "getting better e…" — `ytc_UgxslErbh…`
- "Most individuals are capable of acting within personal responsibility in an ethi…" — `rdc_gtfzzt1`
- "@michaelwhitman1247against an robot who probably has 10x the reflexes, speed and…" — `ytr_UgykTUzII…`
Comment
What that means is your niece isn't needed. Once AI can do the job of healthcare workers people will be obsolete and companies can hike the cost so that only the elite have access. This is a very rose glasses view, but that is not reality. Humans will only become more evil. Humans program, until AI becomes self sustaining. Then humans are not needed. In fact humans are the threat. Not only to AI but to the earth at large. Extermination will have to begin if AI is more intelligent than humans.
youtube · Cross-Cultural · 2026-02-06T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzKfpzMmHQo9tLg0pF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzM8dDiIWsB7qL-29h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxpcDdAPyQzmfOmFf94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx8OYPBU1n7B1TRADZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzcKJs0wMGUoVXmb2x4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzgWWZIFfdhwNBArkl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzNE41cBURNwJ4Vlt54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzDtO1am7tYKzm0HPt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyj8jEMNLCHWGFcAUt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyOo4679r4C8loVMol4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
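The raw response is a JSON array of per-comment codes, which a pipeline would parse and index by comment ID to populate the "Coding Result" table. A minimal sketch of that step, assuming the dimension values visible on this page (the real codebook may define more); `parse_batch` and `SCHEMA` are hypothetical names, not part of the tool:

```python
import json

# Allowed values per coding dimension — assumed from the values seen in
# this dashboard; the actual codebook may differ.
SCHEMA = {
    "responsibility": {"company", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: codes}."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim, "unclear") for dim in SCHEMA}
        # Reject values outside the codebook rather than storing them silently.
        for dim, value in codes.items():
            if value not in SCHEMA[dim]:
                raise ValueError(f"{row['id']}: unexpected {dim}={value!r}")
        coded[row["id"]] = codes
    return coded

raw = ('[{"id":"ytc_UgzNE41cBURNwJ4Vlt54AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
codes = parse_batch(raw)
print(codes["ytc_UgzNE41cBURNwJ4Vlt54AaABAg"]["policy"])  # liability
```

Defaulting a missing dimension to `"unclear"` keeps one malformed row from dropping the whole batch, while the schema check surfaces any label the model invented.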