Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "@Marco Antonio Not all AI are evil. Sure they have the potential to be evil jus…" (ytr_UgxeFt1Qn…)
- "Nah, nevermind, i would bet it was a bot, regurgitating something similar to wha…" (ytc_UgxDwtjKC…)
- "Actually, they don’t really know why the numerous AIs return results that seem “…" (ytc_UgxpPpZvH…)
- "We shouldn't hate people who say straight away that they make AI art. We should…" (ytc_Ugww-Z-pV…)
- "I once heard somewhere that AI will never be autonomous because it'd be very dif…" (ytc_Ugyg8-2lw…)
- "At least I'm safe. I'd like to see a robot teach middle schoolers without firing…" (rdc_cz2ys0j)
- "In china they already used an A.I. as CEO so in the end there are only investors…" (ytc_Ugyjkm4Dm…)
- "Learning how to draw would be easier just go to an art school Artificial intell…" (ytc_UgwqxSHIv…)
Comment
Counter point: AI is making people dumber. Debugging requires training your brain like a muscle. If you offload that and outsource 90% of your coding and bug-fixing to an AI model, you will gradually lose that cognitive fitness to diagnose a difficult problem, or recognise an important edge case whilst you're programming.
So yes, you can drive instead of running, but if you don't exercise anymore you'll get out of shape
| Source | Topic | Posted |
|---|---|---|
| youtube | AI Jobs | 2026-03-11T18:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwNsc2j87xj_d6K9sp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxaPOl4MKwV1_L1cmB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzgml00kvFFaYkij-R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwfBwlXj8E6cRIVHPF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugya5c2gsguTds5XTa54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzm7i1eNUUH_UCmy794AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzrodSy5xUDiVgtajl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwvHT5uye2tdNn7zyh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy24qDFGUO1e-lY_td4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyan-UTjFf7BTpRkCd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```