Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
How old is this interview? Junior developers are not at risk, and I'll tell you why. By the way, don't know who this guy is, but he's grifting. Anyway, junior developers are not at risk because LLMs don't get better because you have to fundamentally re-teach them for them to get better, Junior developers in a year or two become mid-level developers and by a 9th year become a senior developer. LLM does not evolve the same way, they have to be re-thought again and again for each level.
Source: youtube · Posted: 2026-01-10T11:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwmzKJERdwK9Yqt0HR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyt-clNjXXIYWKxR8R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyXExFFqRBqPKtkDGF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwIzuTl90IMyFkZQD94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwdVkMGTV0Bm4DyzA14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxOrRf5qPFFJFe8Sdh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw7bgkhThXN1TrZUxR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxeuXWOh4rnAHozkN54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzHi4c65fIrU1Pccdt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx8YmMWzsKGGt18QIt4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
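A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, assuming the allowed category values are exactly those seen in the examples on this page (the real codebook may define more); `parse_raw_response` is a hypothetical helper, not part of any shown tooling.

```python
import json

# Allowed values per coding dimension (assumed from the records shown
# on this page; the actual codebook may include additional categories).
CODEBOOK = {
    "responsibility": {"none", "company", "government", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"fear", "indifference", "resignation", "outrage", "approval", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments),
    validate each dimension against the codebook, and index by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value {rec.get(dim)!r} for {dim}")
        coded[cid] = {dim: rec[dim] for dim in CODEBOOK}
    return coded
```

With the response indexed this way, the per-comment view shown in the Coding Result table is a single dictionary lookup by comment ID.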