Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- But that's why he mentions that companies that sell "human stuff" (like Walmart)… (ytr_UgwRDA0c9…)
- So robots can automatically be connected through “thoughts” .. Americans can’t e… (ytc_Ugwp80hme…)
- People will develop a human destroying AI and this clip will be played after the… (ytc_UgwwKDOS2…)
- "AI 2027" is fan-fiction written buy a bunch of nerds that totally overestimate … (ytc_Ugydd4pFL…)
- Ya cause one day the robot will take over the world so respect robots or you’ll … (ytc_UgwYR9O-V…)
- Not necessarily. Lots of people make their code open source. Guess they would te… (rdc_j6ebdmn)
- This is what happens when you have woke ChatGPT coders throwing this issues into… (ytc_UgzT9d4Yv…)
- @genericname2747 Why does AI stop you from creating? This doesn't make any sen… (ytr_UgxqR5pC1…)
Comment

> Oh it will be by 2035 AI will be everywhere and it will be in everything to keep our lives nice and simple as for the working class good luck 👍 Your kids and their children are so fucked and it doesn't stop at the working class 👍☺️

youtube · AI Jobs · 2025-10-08T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgztqD-BQLY_PnUiHpt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgywF2N8AHDFGt4fc_p4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxOCFlGU8WSNZ_dIVB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxjFLE2xDGgNet9aad4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzCZCUHFiCSByqtQfh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgymclC97tcpOLd44it4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz13Kr-slYVSypqYVh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzVB3wdj2TFlMICiPF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxpa1mkB7DVZOi51Qh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyzizNKJS8VZymvGbV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
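A response like the one above can be parsed and checked before the labels are stored. The sketch below is a minimal example, assuming the response body is well-formed JSON with the four dimension fields shown; the allowed label sets are inferred only from the values visible on this page and may be incomplete.

```python
import json

# Labels observed in the sample output above; the coder's full label
# sets may be larger (assumption).
ALLOWED = {
    "responsibility": {"company", "distributed", "ai_itself", "user", "none"},
    "reasoning": {"deontological", "contractualist", "consequentialist", "mixed"},
    "policy": {"liability", "regulate", "ban", "unclear", "none"},
    "emotion": {"outrage", "approval", "fear", "indifference", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record's labels."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} label {value!r}")
    return records

# Hypothetical single-record response, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # -> resignation
```

Rejecting out-of-vocabulary labels at ingest time keeps a malformed or hallucinated response from silently polluting the coded dataset.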