Raw LLM Responses
Inspect the exact model output for any coded comment; individual comments can be looked up by their comment ID.
Random samples:

- "A.i is. Cool instead of years wating we. Can. Have it in a few months of new sea…" (`ytc_UgzlzM8ev…`)
- "It still steals from artists, this is to avoid their future art to be stolen jus…" (`ytr_UgwYSdfuD…`)
- "I hate when these two discuss AI. They don't have any real background in techno…" (`ytc_UgydApB1w…`)
- "We are all slaves to a system designed to break us down into smaller divided par…" (`ytr_UgyhKz-i9…`)
- "@moreplease97nope. Survival is a sub-goal of whatever primary goals we give the …" (`ytr_Ugx2SQ-wm…`)
- "AI has been created by collective human intelligence…. Human embodiment, expert…" (`ytc_UgyUCdqpK…`)
- "probably behave better than the current global elites to be honest. AI does not …" (`ytr_Ugxcr0IOr…`)
- "There are way too many daily deaths on the roads, if you do not believe that aut…" (`ytc_Ugysr0pol…`)
Comment

> i mean i get the point, that losing a job sucks, what then why should you prevent these advancements? There are okay'ish ideas in the video, like a human driving a lead truck to keep humans in this industry, but being against autonomous vehicles just for the sake of keeping workers (which are more expensive) does not seem like a good idea to me. I feel like a lot of the anger comes from people being like "its not about solving the driver shortage, its about saving money", but this does not really invalided the actual idea in my opinion.

youtube · AI Jobs · 2025-05-29T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxEviwPgcwh1JBUJVF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyQyQjkj51CvUbVrMt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyNmL-Isi67M8yaqR14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwZgHqbNYED8i-7dIx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzqbcO28bsg2enpNp54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyv6xhu4kTx5U-SOJ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzinYIa1LnCGF2cmLp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzQjTOI_dH5p93lH094AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9xIzUzeIC76f1not4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0FkNjQDAAmvLZQch4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
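A raw response like the one above can be parsed into a lookup table keyed by comment ID. The sketch below is one way to do it, assuming the JSON shape shown here; the `ALLOWED` vocabularies are inferred from the example values on this page, not from an official schema, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension — assumed from the examples above,
# not an exhaustive or authoritative schema.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "user", "none", "unclear"},
    "reasoning": {"virtue", "consequentialist", "contractualist", "deontological", "mixed"},
    "policy": {"liability", "ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping records with a missing ID or out-of-vocabulary values."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec.get("id")
        if not comment_id:
            continue  # skip records the model emitted without an ID
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[comment_id] = codes
    return coded

raw = '[{"id":"ytc_demo","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
print(parse_coding_response(raw))
```

Validating against a closed vocabulary catches the common failure mode where the model invents a label outside the codebook; those records are dropped rather than silently stored.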