Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "It’s not abusing technology. AI can be extremely helpful, but if we’re using it …" (ytr_Ugx7OO_HI…)
- "The biggest problem with AI is that it is so human, in the sense that it is expo…" (ytc_UgzQVlbZ2…)
- "This is an AI that replaces business development representatives, not AI enginee…" (ytr_UgwD9mBUa…)
- ""But my company is special. We have the most talented engineers and guardrails n…" (ytc_Ugwwr9Hz-…)
- "reminds me of two complete losers talking to each other with about an intelligen…" (ytc_UgyqneIFk…)
- "So everyone will be out of a job in a few years. Just like machines put everyone…" (ytc_UgxXNbcve…)
- "A great example of how scientists or researchers can manipulate data for a desir…" (rdc_f9cjsf6)
- "i dont go on the road or own a car anymore. i do not want to die from some drive…" (ytc_Ugx1d98ng…)
Comment
Oh please. Ai is so overhyped it’s ridiculous. There won’t be 5 remaining jobs. You can’t replace blue collar workers completely. Robots will need to be built, repaired, maintained ect. Ai will take a lot of menial, repetitive work. That’s it.
Until a robot gets the dexterity of a humans touch, the ability to move , bend, flex, and quickly adapt like humans do ect it isn’t happening.
Ai will take jobs that it can do WELL. That’s it. It will not take jobs that generally pays very well and can also mainly only be done by humans.
Source: youtube | Topic: AI Jobs | 2025-11-09T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgwXD0_x9I1SAJzfeIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxF0fjcxn0BD7V0R4J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxqHego48Qd1RhpTr94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy5Xtg9LCYcaoFwrPZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzIZBLjEM_UJspboft4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyV_TwyZJz2LVrOY_d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzD0P_ZkML8KsOGCV54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgymTlv776WM0wdeEsN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzKcqWquHIGS8S_n154AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzGf6rhjvEZGTqqp_x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
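A raw response like the one above can be checked before it is stored. The sketch below parses the JSON array and validates each record against the four coding dimensions. The allowed value sets are inferred only from the samples shown here, not from the actual codebook, so treat them as placeholders; `parse_coding_response` is likewise a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the samples above
# (assumption -- the real codebook may define more values).
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer",
                       "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"indifference", "resignation", "fear", "approval", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # Every record must carry the comment ID it codes.
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec!r}")
        # Each dimension must be present with a known value.
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
codes = parse_coding_response(raw)
print(codes[0]["emotion"])  # fear
```

A malformed record (say, `"emotion": "angry"`) would raise a `ValueError` naming the offending comment ID, which makes failed batches easy to re-queue.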