Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "No, we do need guardrails. Society needs competent workers, not people lounging …" (ytr_Ugwjlz0bY…)
- "I think this is only a temporary solution to the problem, since this technique o…" (ytc_Ugy-48U1b…)
- "Hinton is brilliant on AI safety, but his blind spot is governance. He criticize…" (ytc_UgyOMvdeg…)
- "@tspmofrr Not everyone wanting art wants to be an artist. Some just collect it, …" (ytr_Ugzn_8OOU…)
- "They are not stupid, They are selling the bear skin before killing it, and what…" (ytc_UgwlC2gFR…)
- "“The freakiest website on the internet” buddy you have not even HEARD of Janitor…" (ytc_Ugw_5Ksbm…)
- "The reality is the company putting driverless vehicles on the road have very lit…" (ytr_Ugyo4J5mI…)
- "The real problem is not AI, it's the human ignorance and stupidity which is a di…" (ytc_UgzW9lUc2…)
Comment
Yes it’s true but those engineers never thought about secure code. Funnily enough the people who ask the AI to write code, aren’t asking it to write secure code either. There’s so much crap code I’ve had to review over the last year. Full of comments plucked straight out of Claude. To correct it the AI needs to be fed so much context. I found a bug in code in 30 seconds. Took Cursor 10 mins after some context.
youtube · AI Jobs · 2025-10-14T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwBkkPKpZ5BDHrtFkV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXstq_h7PDG6DImp94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_i5x54T0lBYpP1yF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTOE_se5t-3SxO6nN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwum9kYE_7_czplC9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwI8DhZeM4sOtAKIAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxW8ihcT_lWVXm792V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwNF67saZItoMgk6f94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzYASTKzZPgxH5lRkV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyjH-Db1y2-3dAA6l94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
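A raw response like the one above is a JSON array of per-comment codes, so looking up a comment by ID amounts to parsing the batch and building an index. A minimal sketch, assuming only the field names visible in the response above (the `codes_by_id` helper is illustrative, not part of the tool; the two records are copied from the batch):

```python
import json

# A raw LLM batch response: a JSON array of per-comment codes,
# in the same shape as the response shown above.
raw_response = """
[
  {"id": "ytc_UgzYASTKzZPgxH5lRkV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxW8ihcT_lWVXm792V4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

def codes_by_id(raw: str) -> dict[str, dict]:
    """Parse a batch response and index each code record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

index = codes_by_id(raw_response)
print(index["ytc_UgzYASTKzZPgxH5lRkV4AaABAg"]["policy"])  # → liability
```

Indexing by ID also makes it easy to detect comments the model skipped or duplicated in a batch, by comparing the index's keys against the IDs that were sent.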