Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment (youtube · AI Jobs · 2025-06-27T20:3… · ♥ 1)

> I always tell myself as much as we are get risk, alot of this remain to be seen, its the nature of tech, alot of ittends to be over or underestimated, only time will really tell but the thing with AI is that it works on enginuity, i dont believe AI can develop itslef, its developed by human innovation, once you take people out, it becomes stagnat but engineers wont stop working, in or outside these companies
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzj1B_ZnFQJf5rpnmF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzAJQfTmJ3z5-4xUZ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyspErD71wUXQBQPDZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxQpN5cEll351EXdgR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugww4rScqyylCrGnm-Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxY1Ng08c-QOpsDyS14AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwH0FQfQlmkShun1Sp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy59VrMaSFrQ2e6-IJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxDhJsI0CvC23z-EYJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyWBhxSImyAvF8DZwx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
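The lookup this page offers — mapping a comment ID to its coded dimensions — can be sketched in a few lines. This is a minimal, hypothetical sketch, not the tool's actual implementation: it assumes only that the raw LLM response is a JSON array whose objects carry the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields shown above; the `index_codes` helper and the fallback value `"unclear"` are illustrative choices.

```python
import json

# Two rows copied from the sample response above, used as stand-in input.
raw_response = """[
{"id":"ytc_UgxDhJsI0CvC23z-EYJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzAJQfTmJ3z5-4xUZ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

# The four coding dimensions reported in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and map comment ID -> coded dimensions.

    Rows without an "id" are skipped; a missing dimension falls back to
    "unclear" (an assumed convention, not confirmed by the source).
    """
    rows = json.loads(raw)
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
        if isinstance(row, dict) and "id" in row
    }

codes = index_codes(raw_response)
print(codes["ytc_UgxDhJsI0CvC23z-EYJ4AaABAg"]["emotion"])  # -> resignation
```

A real pipeline would also validate that each dimension's value belongs to the codebook (e.g. `emotion` in {outrage, fear, approval, resignation, indifference, mixed}), since LLM output occasionally drifts outside the schema.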