Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.
- ytc_UgzfvLXjl… — "Does not take much effort to create obstructions especially for a lousy ai bot s…"
- ytc_UgwmmDw7_… — "The goal is to teach. If a job can be done with an ai, make the job a trade scho…"
- ytc_UgzDoEbvd… — "If the developer doesn't understand their code, how can you expect the AI to und…"
- ytr_UgzhSgxMC… — "YouTube lets you ask AI any question you want under every video, maybe that's on…"
- ytc_Ugx9OP3OA… — "Its not AI's fault or the developer's, the facial alignments of black people and…"
- ytc_Ugy-4en-k… — "There is someting worse then AI...It has already affected man's history,it is ca…"
- ytc_Ugy_CRD8U… — "Its actually not the AI whats completely dangerous but the people who will opera…"
- ytc_UgwOtjLjU… — "The ethical problems and the moral dilemma are separate problems imo. You do tal…"
Comment

> Rage against the machine never helped anyone ever. Automation has been replacing human labor for more than a century and guess what, the high standards of living we enjoy today are the result. People always found new jobs, more productive and better paying in a lot of cases. And if you rage against the machine and insist on human labor while your competition does not, you will fail anyway cause your products can't compete. This is just another billionare bad video playing on human envy.

youtube · AI Jobs · 2025-10-10T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzm90Qk91fFosRiQAV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw5CGd6AwEja0Md4BZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyvSrWbQKZwPAyqODt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugwo60XguVOy_lwqXP14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgynFAwjB4AdPieVh1h4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxvKeO2hOrTXbV_GmJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzTiRf9I2GgI1naN5t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyjkx9XpQ5gIsDYJT54AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwZitnwv7Y4SF8MeIl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyQaAJb3Rcu6wkSfUh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
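The model returns one coding object per comment, keyed by comment ID, which is what makes the look-up-by-ID view possible. A minimal Python sketch of that parse-and-index step, assuming the raw response is always a JSON array whose objects carry an `id` plus the four coding dimensions (the function name `index_by_id` and the two-entry excerpt are illustrative):

```python
import json

# Shortened two-entry excerpt of a raw batch response like the one above.
raw_response = """[
  {"id": "ytc_Ugwo60XguVOy_lwqXP14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgynFAwjB4AdPieVh1h4AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    rows = json.loads(raw)
    # Guard against malformed model output: every row needs an id
    # and a value for all four dimensions.
    for row in rows:
        missing = [key for key in ("id", *DIMENSIONS) if key not in row]
        if missing:
            raise ValueError(f"row is missing keys: {missing}")
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
coding = codings["ytc_Ugwo60XguVOy_lwqXP14AaABAg"]
print(coding["reasoning"])  # → consequentialist
```

The first entry in the excerpt matches the Coding Result table above (responsibility `none`, reasoning `consequentialist`, policy `none`, emotion `resignation`), which is how a selected comment's detail view is populated from the batch output.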