Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I don't mind ai art for a quick Xbox live avatar or whatever, but goddamn try lo…" (ytc_Ugxx_Z98u…)
- "Multiple militaries have already weaponized AI. Jobs will not be necessary, huma…" (ytc_UgyCWUwVY…)
- "What if i put greese or oil in the path of this robot when its chasing me 😂…" (ytc_Ugx8jA1ie…)
- "I think I should be in control of these "emotions". Sometimes you want a paragon…" (ytc_Ugwq_lsho…)
- "art in itself is human made. ai generated is basically programming with less ste…" (ytc_UgxTSTP0l…)
- "we're about to pass the peak of inflated expectations. people are starting to se…" (ytc_Ugxj4yQOI…)
- "Fun fact, AI doesn't recognize how to properly create human hands in AI art. It …" (ytc_UgxFlsufC…)
- "I use ChatGPT as a tool to narrow stuff down, basically to find out what I shoul…" (ytr_Ugxnv-cXD…)
Comment
1:22 the more you frame your high level understanding of LLMs from the mathematical, statistical view, the more you see the current problems as addressable from an engineering perspective. If you just hear convincing arguments and you don't go learn what is necessary to judge whether the thing you read has a basis in reality and is not guided by the if-it-bleeds-it-leads scare you into buying their book mentality, then you are at best doing yourself a disservice and at worst engaging in motivated reasoning.
youtube
AI Moral Status
2025-10-31T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugznx6Vrfa_ILXDDAmN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwIzZsIk9hou_DkG5d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxuC2lR1DcVZvxeph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2WvPg2zwHagKEc_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw29TXfU1-C6sJ4Iv14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhFUeHflYZB26QLxF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwP_OwAJj7ACUAxfkV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziSIhT7JSsVAbovId4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxG34lc0Pl01TyzbH94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzpiCz-nk2S8FTrSet4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
```
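Responses like the one above can be checked before coding results are stored. The following is a minimal validation sketch; the allowed category values per dimension are inferred only from the responses shown here, so the real codebook may include additional categories, and `validate_codings` is a hypothetical helper, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above; the actual codebook may define more categories.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "industry_self", "regulate", "unclear"},
    "emotion": {"indifference", "approval", "fear", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coding against SCHEMA.

    Raises ValueError on a missing id or an out-of-vocabulary value,
    which is how malformed model output would surface as 'unclear'
    rows in the coding-result table.
    """
    codings = json.loads(raw)  # fails here if the JSON itself is malformed
    for entry in codings:
        if "id" not in entry:
            raise ValueError(f"coding missing 'id': {entry}")
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry['id']}: bad {dim!r} value {value!r}")
    return codings

# Hypothetical single-entry response, shaped like the ones above.
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]'
print(len(validate_codings(raw)))  # → 1
```

Note that the raw response originally ended with a stray `)` instead of `]`; a strict JSON parse would reject it outright, which is one plausible reason all four dimensions in the coding-result table read "unclear".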