Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Humans will become so apathetic to any information, making AI technology the onl…" (ytc_UgxLDShOG…)
- "this comment gives me the same vibe of the commenter in another anti AI post tha…" (ytc_UgwiNzBg8…)
- "same guys that did the video of an atlas style robot refusing to shoot the "spot…" (ytc_UgwmgOc-0…)
- "Why no one is taking about human stop giving birth and slowly went extinct. Whil…" (ytc_UgxsOsR-8…)
- "Well assuming the cause is biased data sets, racism has been deep rooted for so …" (ytc_UgzyMTNQJ…)
- "People with massive fortunes to buy and integrate expensive technologies that di…" (ytc_Ugwihutyi…)
- "Let's make AI su**idal by design. Done and dusted. They're always on the brink o…" (ytc_UgxPwfMJR…)
- "@BaronBacon seeing people say nice looking ai art looks bad is like calling pret…" (ytr_UgzKiEBEP…)
Comment
Producing more code does not equal more skill. If we use that metric, then you start to reward slop and forget that great software is not software that has more LOC but one that can stand the test of time. There's a pretty good reason why an eng. with 20 yrs experience would be reluctant. They've seen what kind of software provides business value and which one does not as well as how important maintainability and understanding is in a codebse.
That said, AI is helpful when learning as long as you don't use it as the sole source.
Source: youtube · Viral AI Reaction · 2026-03-05T23:4… · ♥ 15
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwOZexr6ZUSxm1cNDh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzhpYfjwwVx5RODrEZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhueAHQ7F8snrI5CZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyrzxvNS39C4gLfFPR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxIyUrZDtqIC04tPox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycIR4rGlV3wSC_C5l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzobKxIUck3ID-dOkJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgysBWv15LLCHyvJhWp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYL024bP3j6Drqo5t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy6FwvbA1MCLvfgtaJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
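The "look up by comment ID" step above amounts to parsing the model's JSON array and indexing the records by their `id` field. A minimal sketch of that lookup, using two records excerpted from the response above (the function name `index_by_comment_id` is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# Two records excerpted from the full response above for brevity.
raw_response = '''[
  {"id": "ytc_UgwOZexr6ZUSxm1cNDh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgysBWv15LLCHyvJhWp4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgysBWv15LLCHyvJhWp4AaABAg"]["emotion"])  # indifference
```

The indexed dict also makes it easy to cross-check a rendered Coding Result table against the raw response for the same comment ID.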