Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxTKGZle…: "My condolences. What a tragedy. Chat bud AI legal? What a coincidence. My grandd…"
- ytr_Ugx6el1ws…: "Fair point, but the calculator analogy breaks down, calculators solve problems w…"
- ytc_UgwUSoZ6Y…: "Comparing gpt 4 to gpt 5 is unfair. You should compare it to the last model of o…"
- ytc_UgyhrLT1Q…: "the only way to stop ai is if the boss thinks one fifth of pay is more than one …"
- ytc_UgwAzoz5n…: "Please hear me when I tell you this. You can have all the bells and whistles, ga…"
- ytc_Ugzuakxmp…: "“i do all my work in 2 hours and its all through AI apps.” that sounds like abso…"
- ytc_UgzPuAazx…: "From my vantage point, companies do not want to train entry level developers so …"
- ytc_Ugz91kgrz…: "I hate ai, because it is not work. There is no actual work, no passion behind it…"
Comment
Before the recent tech surge we made machines that solved problems. Now with general ML/AI and humanoid robots we are making machines that then look for problems to solve instead of designing specific use case devices and expanding on them after.
I think there's something to be discussed about why general use devices are trendy when they haven't historically provided more value over specific use case ones.
Source: youtube · AI Moral Status · 2025-10-30T20:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxdWB2GvyUuqIVlCi54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgybtjBUk39J3illv054AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6W6lYH1D8Uj9Bwxl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzLmQDK4VS0RkkLAUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw83iGH3FmGlHOpS314AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw5gyINpG8jmJV9s6V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzelWm4EbPVk114lMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyk7e-1BrjucVChMBR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxBdApmyz7dTqviZ154AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz590g8tnUELebYGlN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
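The raw response is a JSON array with one record per coded comment, keyed by comment ID and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be parsed, validated, and looked up by ID; the allowed code sets and the `parse_codings` helper are assumptions inferred from the visible sample, not a documented schema:

```python
import json

# Assumed code sets, inferred from the sample records above (not a
# documented schema); extend these if the real codebook has more values.
DIMENSIONS = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "none"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "none"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting
    records with missing dimensions or unexpected code values."""
    out = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        out[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return out

# One record taken verbatim from the sample response above.
raw = ('[{"id":"ytc_Ugy6W6lYH1D8Uj9Bwxl4AaABAg","responsibility":"developer",'
       '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugy6W6lYH1D8Uj9Bwxl4AaABAg"]["responsibility"])  # developer
```

Indexing the parsed records by comment ID mirrors the "Look up by comment ID" workflow: given an ID from the sample list, the coding shown in the result table is a single dictionary lookup.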