Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Anything you say to an ai is being given to a corporation. Do not trust them. It…" (ytc_UgxiGJ7G_…)
- "Not sure if AI is anywhere close to taking healthcare jobs. It is however becomi…" (ytc_UgzfhC7yu…)
- "What AI service have they been using for this video, and the last video? Anyone …" (ytc_UgwK3Saui…)
- "Interesting, what you're saying. Now, what is free will? We're not…" (ytr_UgyvMq8z6…)
- "So, Roman, if you're 100% sure we live in a simulation, then why worry so much a…" (ytc_UgxOeyG6F…)
- "Not to do with social score or anything, but most of my university lectures took…" (rdc_eepiq7b)
- "What are they talking about? Why are they not acknowledging that they (and we) h…" (ytc_UgzRLgvTq…)
- "I'm legit scared for the future. Imagine having this thing walking your house. I…" (ytc_UgyxUfqkZ…)
Comment
I recently dealt with some code and asked Grok/Claude for help. Over and over (and over), it looked at the supplied code and said things like "The bug is right here...", "The issue is obvious...", or "Clearly, this is incorrect...", yet the "corrections" did not fix the bug, or worse, introduced new ones. I had to keep correcting it, forcing it to focus in on the exact issue, and only then did it "see" it. My opinion is that it takes the skills of a software engineer to know what and how to ask. Managers who believe that they can eliminate coders and just ask Grok/whatever themselves, are in for a surprise...
youtube · AI Jobs · 2026-03-10T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyVcQ5cmAL3fb8ij_t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzr02reWKGQltTRV_94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSQcSjCvFd54xhyeV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx9SZQzLNAffawvmO94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzsLODuJQJoRxUi85p4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9waY-DD8Jam4sq4t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxTCJT-Tu3U48vJzOJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyM4SQcbRNZhZkKyQN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwuf_g9RZT7zCST1E54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxnZ4s_0wysjZ0bi614AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
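The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a payload might be validated before ingestion, assuming the allowed values are exactly those visible in the result table and the sample entries here (they are inferred, not taken from any documented schema):

```python
import json

# Allowed values per coding dimension. NOTE: these sets are an assumption
# inferred from the entries shown above, not a documented schema.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"none"},
    "emotion": {"mixed", "approval", "indifference", "outrage"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed entries.

    An entry is kept if it is a dict with an "id" key and every coding
    dimension holds one of the allowed values.
    """
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(entry)
    return valid

# Hypothetical example entry in the same shape as the response above.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
print(len(validate_codes(raw)))  # 1
```

Filtering rather than raising keeps a single malformed object from discarding an otherwise usable batch; a stricter pipeline could log the rejected entries instead.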