Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgypSGY4B…` — "I know most people hate AI atm, but it's the future one way or the other. Our en…"
- `ytc_UgyU8Fgv0…` — "He turned a video about AI ART into an Advertisement of his website xd i don't m…"
- `ytc_Ugj1f08yN…` — "The best thing I could come up with is dependent on the level of autonomy the ro…"
- `ytc_Ugz3Vq7Y8…` — "To add to the AI shit, I've recently received an AI generated query for a childr…"
- `ytc_Ugz2HPNgW…` — "You gotta look at A.I. like this as if its a severly logic based autistic person…"
- `ytr_UgxcabgAD…` — "That's an interesting perspective! The name "Sophia," meaning wisdom, certainly …"
- `rdc_jj9rid8` — "I found many YouTubers full of misinformation and click bait and irrelevant info…"
- `ytc_UgwNGGrYR…` — "You told an LLM to roleplay, and it did. Then you reacted as if the LLM had the…"
Comment
I have been using AI for years now and it will never replace good developers. The code these AI Models produce is dependent on a lot of things and their big weakness is libraries, frameworks that have a lot of changes in a short period of time. As the models use statistical analysis to produce results, new releases of libraries do not have the historical information for the Models to (1) recommend (2) produce the examples of the changes to the library (3) have the code correct for the edge cases / fall throughs. And due to the moving window for the prompt and the context, if you have a very large design submitted to the model, not all the bits and pieces are stitched together correctly.
And when it comes to requirements and design work, these people will be in demand.
Source: youtube · Topic: AI Governance · Posted: 2025-07-15T02:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzPq3nqDV53YPl-zXB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyz8aGdK5dLTprIv214AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzGa0GPiYnOxHNnBOB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzRpNx5it1_saM0_zR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxqjyHMMmxdb6V_SFN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxkeOm5kO_AS3tWORB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"ban","emotion":"mixed"},
{"id":"ytc_Ugwrq7k8maFQvlnsvYd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyvTI4jY0kPWjJOcP94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwOqyL2Io52TOCKcxd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwpBLOoNqtvZnUUGLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
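A raw response like the one above can be checked before the coded values are stored. The sketch below is a minimal Python validator; note that the allowed value sets are inferred only from the rows shown here (the full codebook may define more categories), and the ID prefixes (`ytc_`, `ytr_`, `rdc_`) are an assumption based on the sample IDs in this view.

```python
import json

# Categories observed in the coded rows above. These sets are an
# assumption inferred from this page, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

# ID prefixes seen in the samples (YouTube comment/reply, Reddit comment).
ID_PREFIXES = ("ytc_", "ytr_", "rdc_")

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with a recognized
    id prefix and an in-vocabulary value for every coded dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not str(row.get("id", "")).startswith(ID_PREFIXES):
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

sample = (
    '[{"id":"ytc_UgzPq3nqDV53YPl-zXB4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)
print(len(validate(sample)))  # 1 — the row passes validation
```

Rows that fail either check are silently dropped here; a production pipeline would more likely log them for re-coding rather than discard them.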