Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgxvH2leO…`: "you're a really talented speaker i don't usually watch these talks but ai has ca…"
- `ytc_Ugwr4mycU…`: "They put the waymo only in San Francisco because they know people there will exc…"
- `ytc_UgwiyS5O2…`: "Im gonna create art no matter what because I enjoy it. Only people that hate cre…"
- `ytc_Ugxf9yLjh…`: "The fact of the matter is that the first point about A.I. \"stealing\" is contradi…"
- `rdc_oad47sw`: "> AI is the excuse — I can't believe how often this is parroted here. As an ac…"
- `ytc_UgxgfVM5b…`: "In 2026 AI will drive the Economy like Germanwings straight into the side of a m…"
- `ytc_UgxfkEKJK…`: "The internet didn't act as its own agent. AI can act as its own agent at singula…"
- `ytc_UgwZEqqvT…`: "AI will also allow students to unmask the worst teachers for firing and therefor…"
Comment
I do believe LLMs can cost us our jobs. I had MANY instances where if I followed the solutions suggested by the AI, I would have wrecked Production and gotten fired. Or annoyed upper management with requests that would have gone unanswered and probably also shitcanned.
Often times it's very subtle errors where you need a lot of attention, diligence and expertise to even spot them. I don't trust them anymore because I'm not confident I'll spot them all. Sometimes it's because I didn't provide enough context, sometimes I did. But the inability to say is the worst.
Source: youtube · 2025-12-14T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyQnNvnaRkOFAaHU6Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz-ZOJPPwaIJZuVeRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx6VNhnASYuNpw_AVt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxGLgYw0NMWvKIj8bd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugybik1FkbaGJfTDgy14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxlSdo4-3AFcSFMgop4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwkcRwQiLR-we2M_Cl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxyMEneHAowKXrS5Dp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzX5IkPz0AaSkZ6UAZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxTfSS7YQij7HB4b7Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"}
]
```
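A raw response like the one above can be parsed and indexed by comment ID with a few lines of Python. This is a minimal sketch, not the tool's actual implementation: the allowed value sets below are inferred from the values visible in this page's examples and are assumptions, not a complete schema.

```python
import json

# Assumed value sets per coding dimension, inferred from the examples
# shown above; the real schema may include more values.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval"},
}

# Two rows copied verbatim from the raw response above.
raw = '''[
  {"id":"ytc_UgyQnNvnaRkOFAaHU6Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz-ZOJPPwaIJZuVeRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

def index_codings(raw_json: str) -> dict:
    """Parse a raw model response and return {comment_id: coding},
    dropping any row whose values fall outside the allowed sets."""
    out = {}
    for row in json.loads(raw_json):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[row["id"]] = row
    return out

codings = index_codings(raw)
print(codings["ytc_Ugz-ZOJPPwaIJZuVeRB4AaABAg"]["emotion"])  # fear
```

Looking a coding up by comment ID is then a plain dictionary access, which is what the "Look up by comment ID" box above amounts to.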