Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
imagine wasting trillions of $ into the A.I nonsense but also you know it's very…
ytc_UgxAqfM7o…
I've always found it strange that the most supposed smart people are the most st…
ytc_UgyXDMwY0…
We’re losing jobs to AI because that’s where companies are investing heavily, no…
ytc_UgyhulWzZ…
AI can't wipe it's own ass "hypothetically". "Trans-formative" will turn into a…
ytc_Ugxieig7L…
Tax "robot labour"? Is this serious?
What stops companies from installing the r…
ytr_Ugy_Q_27v…
I support Universal Basic Income 1000 percent. It' very uncertain out here. Peop…
ytc_UgyA5xKdR…
Just wait until robot apocalypse! THE HUMANS ARE GETTING LAZY BECAUSE OF THE ROB…
ytr_UgzUeVN_v…
"Bias testing!" Right! Joseph Googlebbels designing algorithms to censor anyone …
ytc_Ugzcppe44…
Comment
Saying that self-driving cars are 'evil' because they will 'lead to loss of jobs' is as stupid as saying that cars are 'evil', because they will replace horses.
Or modern mechanized agriculture is 'evil', because there will be less 'peasants' in fields.
Or robots inside factories are 'evil', because they reduce the amount of factory workers to abuse and to drive to jump out of roof.
It is saddening to see growing Ludditism among american left...
youtube
AI Jobs
2025-06-21T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyzpuUJAI6qkHpwHTR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxnXs2n7EnUFRKuWjN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxhaNzJeMMKnIUibTx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw-GQ5cnjnXEM1ovoJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyiQlzw7hXBZl9_HhJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxkGexVrCU6Hz6uUP54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw_vLwMLRY4Browxx54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxUdLo40hGkD9Sihkt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwW-MoG--D-dJzUpnx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzZBBlYM_XTExjHsJ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
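The raw response above is a JSON array with one object per coded comment, carrying the four dimensions shown in the Coding Result table. A minimal sketch of turning such a batch response into a lookup table keyed by comment ID (the `parse_batch` helper and the `DIMENSIONS` tuple are illustrative, not part of the actual pipeline; the dimension names are taken from the sample output above):

```python
import json

# Dimension fields observed in the sample batch response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Return {comment_id: {dimension: value}} from a raw LLM batch response."""
    entries = json.loads(raw)
    coded = {}
    for entry in entries:
        # Keep only the known coding dimensions, keyed by comment ID.
        coded[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return coded

# Example using one entry from the response above:
raw = ('[{"id":"ytc_UgwW-MoG--D-dJzUpnx4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
codes = parse_batch(raw)
print(codes["ytc_UgwW-MoG--D-dJzUpnx4AaABAg"]["emotion"])  # outrage
```

A dict keyed by comment ID makes the "Look up by comment ID" operation a single indexing step rather than a scan of the array.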