Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Is this actually a bad thing for liberals though? Seems like MAGA get's easily r…" (rdc_ohtxlld)
- "The best AI image use case I think is for obscure stock photos for business to b…" (ytc_Ugy6vEZF1…)
- "that is incompetence, developer know that but they avoid the fix, just a few ins…" (ytc_Ugxxyq92f…)
- "In the digital realm where algorithms thrive, / A forecast unfolds, where technol…" (ytc_UgyqWYsMG…)
- "I think we're all mistaken about how AI steals dev jobs. No it won't come to wor…" (ytc_UgwszxMbF…)
- "It is better for a human to make something bad than AI to make something good.…" (ytr_UgwsYoayL…)
- "I think I caught Dave lying about the responses from ChatGPT, in min 24:00 an AI…" (ytc_UgzHy5KNg…)
- "I think we're already starting to seeing this. Not in the sense that AI is going…" (rdc_mva8bhd)
Comment
I am only 6 minutes in, but so far you are ignoring the elephant in the room. The same elephant everyone likes to ignore. AI has no agency! AI doesn't WANT anything. It has no aspiration and it isn't plotting unless someone told it to. It does nothing when not being utilized. It isn't stewing or plotting. As soon as anyone makes an AI that has agency it will be a HUGE leap from where we are at and then all this makes sense. NOW, however, it is just click bait crap
youtube · AI Moral Status · 2025-10-31T15:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugznx6Vrfa_ILXDDAmN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwIzZsIk9hou_DkG5d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxuC2lR1DcVZvxeph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2WvPg2zwHagKEc_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw29TXfU1-C6sJ4Iv14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhFUeHflYZB26QLxF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwP_OwAJj7ACUAxfkV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziSIhT7JSsVAbovId4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxG34lc0Pl01TyzbH94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzpiCz-nk2S8FTrSet4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"})