Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "Basically talking about replacing some human jobs with robots. But if they were …" (ytr_UgxaA2RDq…)
- "It's not robot soldiers, it's just like missles and other dangerous weaponry mad…" (ytr_UgyUsy0gH…)
- "3:36 oh umm don't they get money ya so just hire actual artists to paint picture…" (ytc_UgwhaaHqN…)
- "This is a good precedent from disney or any big corporation spewing out AI gener…" (ytc_UgxAH-gIs…)
- "I think the salient point here is that given the two options, if AI takes over t…" (ytc_UgxzhGBmj…)
- "They should make it so that people who support ai art and similar things can onl…" (ytc_UgwQ3tj_I…)
- "AI is far more dangerous! If anyone does not speak the real Dangers of using AI …" (ytc_UgwFmOzPL…)
- "This problem came from people not distinguishing LLMs generative programs from t…" (ytr_UgwzagCkW…)
Comment
This is a serious wake-up call! 😱 Hearing that millions of jobs (like nurses, accountants, and truck drivers) could be replaced by AI in the next decade is scary [06:12]. Senator Sanders makes a good point: these companies are pushing for robots just to cut labor costs and make billionaires even richer. We need to talk about who AI really benefits. 🤖📉
#AIandJobs #WorkingClass #FutureOfWork
youtube · AI Jobs · 2025-10-09T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[{"id":"ytc_UgySBE78b46nVSeK23B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzPfsajtjN9ERYdph94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxCMFXtPwtC0sF9T6h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwVCZHeR_PHcfm5U854AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgwZHSe_ukmvhkQuUDV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgzBMdrEg7_E3dooT3F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugy96vU6Al_Lg8IOcFB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzNMMuS_sfQNjbSThh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzWrPQQH0_kiGQYufR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugzqb7eubDWrr8kbO3R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```
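A raw response like the one above can be parsed and sanity-checked before the coded dimensions are stored. The sketch below is a minimal example, assuming the allowed value sets inferred from the sample output (the actual codebook may define more categories), and the `ytc_X` record in the usage example is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# This is an assumption; extend the sets to match the real codebook.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"resignation", "approval", "fear", "outrage", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}"
                )
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_X","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
records = parse_coding_response(raw)
print(records[0]["policy"])  # liability
```

Rejecting out-of-vocabulary values at parse time keeps malformed model output from silently entering the coded dataset; invalid records raise a `ValueError` naming the offending comment ID.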