Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "They are literally already collecting all the data in our phones and so many of …" (ytc_UgzN3_N5V…)
- "I clicked just because of angel engine thumbnail and somehow you managed to neve…" (ytc_Ugz6fA4nN…)
- "Artists who do work based on previous works or replicate styles as studies GIVE …" (ytc_UgwnZz8l7…)
- "Ai learned from us the knowledge we have typed / We are racist, and we are not …" (ytc_Ugz-ldr2U…)
- "Met a regional Microsoft office higher up guy. He mentioned last month they laid…" (ytc_Ugwtwcbtt…)
- "I would NEVER ride in an Automated Vehicle.. Somethings just require the human f…" (ytc_UgzcBrOAR…)
- "Believing a chatbot can become intelligent is like believing once graphics are a…" (ytc_UgxtcGSSb…)
- "Stop watching YouTube theories and read Selwyn Raithe's book . The author connec…" (ytc_UgxDfmSkG…)
Comment
Meh. Eric Schmidt might have a vision, that doesn’t make it correct. Specific to programmers though: “What’s the language you program in… it doesn’t matter” ...sure, if all you have is a hammer, everything becomes a nail. But it’s not that simple or black and white in every case. Ten years of deep, hardened problem-solving in a specific low-level language shapes how you think, not just what you produce.
AI is trained on documented patterns and solutions. It can generalize impressively, but that doesn’t make long-earned, experiential expertise irrelevant or interchangeable, much like autonomous driving changes driving, without truly replacing drivers.
Source: youtube
Timestamp: 2026-02-10T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxOoIYnrlV9T19-hpl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgypoEhceTLLOW9ji3N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy5cDA3X3PoPpMBBx94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugz7Fd5He23XqCbGr7R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzgPbH_SDIyvjlhTNF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugwf07kJ-EHQs0xyHIV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxfZdgEUl-OYBujSkt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxFCi-m_aO29V5FcX14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz3tcBdSOw1eR8DaOp4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwAgpIWNhFRDyiaJt54AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
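The raw response above is a JSON array with one object per comment, keyed by comment ID and carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such output might be parsed into a per-comment lookup — the dimension names come from the Coding Result table, while the comment IDs below are hypothetical placeholders and the "default to unclear" fallback is an assumption, not a documented behavior of this tool:

```python
import json

# The four coding dimensions visible in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records)
    into a {comment_id: {dimension: value}} mapping."""
    records = json.loads(raw)
    codings = {}
    for rec in records:
        # Each record must carry an "id"; any dimension the model
        # omitted falls back to "unclear" (an assumed convention).
        codings[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return codings

# Hypothetical example IDs, shaped like the real records above.
raw = '''[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "unclear", "reasoning": "deontological"}
]'''

codings = parse_codings(raw)
```

This makes a "Look up by comment ID" feature a single dictionary access, e.g. `codings["ytc_example1"]["emotion"]` yields `"fear"`, and dimensions missing from a record surface as `"unclear"` rather than raising `KeyError`.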