Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect:

- "I honestly think AI is better programmed for socialist outcomes rather than capi…" (ytc_Ugz1Snx7l…)
- "@drjoshcsimmons its only going to get worse from here though. did you see the w…" (ytr_UgzPBQ1LL…)
- "> Never in human history had we defeated a contagious illness through herd im…" (rdc_g9tyszb)
- "I put the text of your teacher’s email into https://detecting-ai.com. Yes, I kno…" (rdc_kgtf5s0)
- "Now I don't want to stir up any drama, but *Best* Korea still has a manned space…" (rdc_cjoq86o)
- "If this could be fixed or wanted to be fixed it would have been already. To prov…" (ytc_Ugz6qQ0VJ…)
- "34:42 He's calling Elon musk. the one person who've been warning about AI going …" (ytc_Ugx6WoPVJ…)
- "I asked ChatGPT and it said and I quote: the video is misleading or edited for s…" (ytc_UgxbxULwh…)
Comment

> Great video! I agree that AI is a tool rather than a replacement, and coding will still be valuable in the long run. However, I get the idea of not blindly relying on AI, but saying ‘don’t let AI write code for you’ seems outdated. The reality is, most developers already use Stack Overflow, blogs, articles, old source code, documentation, and tutorials—which is exactly what AI is trained on. Why waste time manually searching when AI can provide the same information in seconds? Wouldn’t the smarter approach be learning how to use AI effectively, by strengthening your programming fundamentals and getting good enough to be able to read, comprehend and modify the code AI generates for you rather than avoiding it altogether?

Platform: youtube · Video: AI Jobs · Posted: 2025-03-08T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy5QtoZPL3YtqJyEkN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyeigm13ArDgwC8yiN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyTyKaGL3DYymPOFv94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwp_END8zTZ2T972rV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzmRhrMBz8c4MheJpV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwr5dmQ8mHLlg8ZOgR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyF6UwnoOEtdQD4aTR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgydFDH7_yatCYaVrel4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgynXOBY0tFyIhWDNK14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyiY9AIIxrDmKQUYV54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
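A raw batch like the one above can be checked programmatically before it is accepted into the coding database. The sketch below is a minimal validator; the allowed value sets are inferred only from the codes visible in this sample (the full codebook may define more), and the function name `validate_batch` is an illustrative choice, not part of the actual pipeline.

```python
import json

# Allowed codes per dimension, inferred from the sample response above
# (assumption: the real codebook may include additional values).
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "outrage", "fear", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any out-of-codebook value."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
print(len(validate_batch(raw)))  # 1 — the single record passes validation
```

Rejecting a whole batch on the first bad value keeps the stored codes clean; a softer variant could instead collect offending IDs for manual re-coding.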