Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "ai detectors are literally ass. I completely did an essay with no ai and it was …" (ytr_UgyvyU_tC…)
- "My question of the auto pilot programmers is this . At some point the auto pilot…" (ytc_Ugzx32_wx…)
- "What you mean is you'll be doing the same jobs once they realise actually ai is …" (ytc_UgxLPKoOD…)
- "SOMEONE AT MY COLLEGE SEARCHED UP CHARACTER AI ON A SCHOOL COMPUTER FOR FUN 💀…" (ytc_UgwNvkyhM…)
- "AI 🤣 we are nearly there from inflation already 😂 I pay 750 in child support a m…" (ytc_UgwXSOEHz…)
- "Did they send him to the psychologist? For anger management and some classes …" (ytc_Ugw1gq8pG…)
- "Way way off on it. Ai technology and quantum tech will exponentially grow. It Is…" (ytc_UgzGPb8GA…)
- "It's so frustrating knowing that the main reason people want driverless cars is …" (ytc_Ugxu88lil…)
Comment
I’ve never really understood the fear.
AI isn’t different from past waves of automation — it just eliminates more advanced tasks.
And that’s a good thing.
It forces engineers to focus on what actually matters: defining the right requirements and delivering outcomes. Something they should have been doing all along
The future engineer isn’t a ticket taker — they’re a product shaper
Source: youtube · AI Jobs · 2026-02-22T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyWzNzuf4D6w-obff54AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyLiULpsmE1ddxYipF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8axyL21Wd7BJ0q2J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyf0b3Dp6GYmDPVsPd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxSbBBT6WJ2SNSyYFJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzIYpPLNTIDm9240KN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxMRgQrg97Dxe7CwBt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxcmG53f2gxFTZQRu54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxEzvEtxfk0m8tJ1714AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw4wIcXBcFwPlSENFx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
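A minimal sketch of how a raw response like the one above might be parsed and checked before the codings are stored. The allowed value sets below are inferred only from the codings visible on this page, not from the pipeline's actual schema, so treat them as placeholders:

```python
import json

# Allowed values per coding dimension — inferred from the results shown on
# this page; the real coding pipeline's schema may include other values.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "virtue"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"approval", "outrage", "fear", "resignation", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        # Every entry must be an object with a comment ID.
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        # Every dimension must be present with an allowed value.
        if all(entry.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(entry)
    return valid

# Hypothetical two-entry response: the second uses an out-of-schema value.
raw = (
    '[{"id":"ytc_X","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_Y","responsibility":"alien","reasoning":"virtue",'
    '"policy":"none","emotion":"outrage"}]'
)
print(len(validate_response(raw)))  # prints 1 — the invalid entry is dropped
```

Validating at this boundary means a malformed or hallucinated coding is dropped rather than silently written into the results table.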