Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytr_UgwC-s01X…: @DanknDerpyGamer I mean in this instance, not AI overall. You can use AI ethical…
- ytc_UgyxRkaHv…: BAD A.I = DEATH GOOD A.I = CHANCE OF SUCCESS / CHANCE OF DEATH!... do these soun…
- ytr_UgzkwSlo0…: @ right, I’m not saying that it’s a good thing either, just poking at those “AI …
- ytc_Ugy2KXqky…: I hate the argument of "it frees up time" the most. Wasn't the whole point of au…
- ytc_UgzzcEP1Y…: A base for all ai development like Google Android apple and Windows. Provide a…
- ytc_Ugx-Qt-L_…: We know what They want but if we continue buying the new car, new phone, new vac…
- ytc_UgyQ-QHiX…: AI is a bubble - The technology is so far from prime time. Close to zero compani…
- ytc_Ugy42MfiL…: Most of us know exactly what's coming it's the rest of the people in this countr…
Comment

And me currently in my second year in CS not knowing whether or not there will be any jobs left by the time i graduate with the ai takeover, and the oversatured tech job market. I am really passionate about software engineering, but i am considering dropping out to find another domain where the risk of ai automation is lower, because i have to retire my parents as their only child.

youtube · AI Jobs · 2026-02-04T22:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyjrY-eKj3MAv6qv_d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyxEvdJSigJ47QjkTt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxlccscRJajYfTxAUV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwkJfYO4zkHBxan5xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwfxyI01iwItfDyMa14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxqPxheQhAFEmoFUYF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"indifference"},
{"id":"ytc_UgyAsEAKP9AGcgY1O0h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwvfw8a6OBK-51PEz54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy4DJEOhnIRr8PZHE54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw-qoUJYC67fnxZeah4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
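The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions. A minimal sketch of how such a response could be parsed and validated before filling the Coding Result table; the allowed category values here are inferred from the samples above, not an exhaustive codebook:

```python
import json

# Category values observed in the responses above -- an assumption,
# not the tool's full codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "industry_self", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a
    lookup table keyed by comment ID, rejecting off-codebook values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-record response, shaped like the array above.
raw = ('[{"id":"ytc_abc","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = parse_coding(raw)
```

Keying by comment ID mirrors the dashboard's per-comment lookup: `coded["ytc_abc"]` returns the four dimensions for that comment, and any value outside the observed categories fails loudly instead of being stored silently.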