Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This is what I'm talking about the robot apocalypse straight up bro they're figh…
ytc_UgyFMgVEf…
Nobody knows what would be the future because nobody knows where the AI will be.…
ytc_UgxKls3te…
To those who don't have functioning neurons in their brain....... AI is not an a…
ytc_UgzuRmou7…
"AI 'art' allows the wealthy to access a skill while preventing the skilled from…
ytc_UgwAYSmhK…
@AsianDadEnergy The brain isn’t only a prediction machine, but every output from…
ytr_Ugz99QFA7…
True and it's still painful to remove background from a pic to make a perfect pa…
rdc_n7yolgh
@bradweir3085 harming openai by assassinating someone who was about to testify a…
ytr_UgyypQpm3…
In the end it all goes to fundamental problem of missuse of technology. So the p…
ytc_Ugxs32GfF…
Comment
The idea of not having developers there that know the ins and outs of a codebase is the stuff of nightmares for any programmer. It'll work at first for awhile but something will break and something will go wrong and if you you used A.I to make everything you will have zero clue how to fix it at all. If you need to add specific changes or upgrades you're screwed. Even if there's A.I generated documentation. Having millions and millions of dollars riding on something that can fly off the rails with no ability to stop it is what we are looking at.
youtube
AI Jobs
2026-02-06T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgybcnRXnzKODOcDA5R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxghy9C8n0QtmVYolt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3kiAvAgk6W-xX_rl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxK2XDiSTlY0v0wPFx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyAutQLxGZWCefkXAB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz5voaIEUF07yRv--B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx7m1xorNA3ZBpYBAV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgykhDs1K5t4RxzP7Ox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxFbsjWnbWegGMe0d54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxt8HLhvgUzOpGDBwx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
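The raw response above is a JSON array of per-comment codes, one object per comment with four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and tallied; the sample IDs and dimension values here are illustrative stand-ins, not real coded comments:

```python
import json
from collections import Counter

# Illustrative raw LLM response in the same shape as the one shown above
# (shortened, hypothetical IDs).
raw = (
    '[{"id":"ytc_sampleA","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"},'
    '{"id":"ytc_sampleB","responsibility":"company",'
    '"reasoning":"virtue","policy":"none","emotion":"outrage"}]'
)

# Dimensions as they appear in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally(response_text):
    """Parse a raw LLM response and count values per coding dimension.

    Missing keys fall back to "unclear", matching the table's default.
    """
    codes = json.loads(response_text)
    return {
        dim: Counter(code.get(dim, "unclear") for code in codes)
        for dim in DIMENSIONS
    }

counts = tally(raw)
print(counts["emotion"])  # Counter({'fear': 1, 'outrage': 1})
```

A truncated or unbalanced response (e.g. a stray `)` where `]` belongs) would raise `json.JSONDecodeError` here, so a real pipeline would likely wrap the `json.loads` call and fall back to "unclear" for every dimension, which is consistent with the all-unclear Coding Result shown above.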