Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- ytc_Ugw2Flgj4…: Yes AI is taking our jobs , (AI stands for Actually Indians btw not artificial …
- ytc_Ugxpj6MJG…: Musk warning against AI and at the same time developing it . We're in trouble. …
- ytc_Ugz5JU8Kg…: "I need to go have dinner with my family" "Is your family more interesting than …
- ytc_UgwG990Z3…: So basically Will Smith made a fantastic movie about what could happen with AI a…
- ytc_UgxYCX18u…: 4:03 sounds like some of us humans. i mean some of us do the same thing. not t…
- ytr_UgyfIOcfg…: Lex.Fridman has super guests on AI and other STEM subjects. Utube available. You…
- ytc_Ugzvygy7s…: I just don't want the AI to be mad at me. List Chat, I didn't even mean to click…
- ytc_UgyMkTvBI…: You see these robot arms are used in one of those Harry Porter rides in Universa…
Comment
99% by 2030 is fearmongering. That’s such an insane number, Manuel jobs and service jobs and social jobs will still be needed because a lot of people will still WANT to interact with human beings. They will reject full artificial replacement. It’s white collar jobs that will be devastated, not blue collar or service jobs. And new niches of work will be created. AI tools for film and animation will only go so far, because we already have a backlog is so much great media that AI slop won’t be tolerated. Already, unless it’s a simple meme, I won’t click on any video I suspect to be ai generated. It’s toxic ooze. Regurgitated garbage. On principle I avoid it and I think a lot of others will too.
youtube · AI Governance · 2025-10-02T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
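Each coding record is categorical along the four dimensions shown above. A minimal validation sketch in Python, using only the category values observed on this page (the project's full codebook may define additional categories; the function name and structure are illustrative, not the tool's actual code):

```python
# Allowed values per coding dimension, as observed in this sample.
# Assumption: the full codebook may define categories not seen here.
DIMENSIONS = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in DIMENSIONS.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems
```

For example, the record coded above (`responsibility=unclear`, `reasoning=consequentialist`, `policy=none`, `emotion=indifference`) passes with no problems.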
Raw LLM Response
```json
[
  {"id":"ytc_UgyjRvTKxI5hZdgdAGB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxjeFwYxfrSH41DszF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxRNocyNt6JD-owuk94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxbtz1H3Rk-58bczEp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxat-HqrYf-tfgRoQR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzJ-1OhTssgjIM1N8l4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzaplful2N0bjm_wJx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx_60cxnFfaDH-lYJh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzfITiaZWeR3BMDuUx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxOOabu-7dVyacjpL94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
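The raw response is a JSON array of per-comment records, so a batch of codings can be parsed and looked up by comment ID in a few lines. A minimal sketch, using a shortened two-record excerpt of the response above (`index_by_id` is an illustrative name, not part of the tool):

```python
import json

# Two records excerpted from the raw LLM response shown above.
RAW_RESPONSE = """
[
  {"id":"ytc_UgyjRvTKxI5hZdgdAGB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxRNocyNt6JD-owuk94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a batch coding response and index its records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_UgxRNocyNt6JD-owuk94AaABAg"]["reasoning"])  # → consequentialist
```

In practice a model may wrap the array in prose or code fences, so real parsing code usually extracts the bracketed span before calling `json.loads`.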