Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I would disagree that it's impossible for a LLM to be conscious, as we haven't y…" (ytr_UgxnduCsu…)
- "But the issue of token economics - just using cost per token and number of token…" (ytc_Ugy5jXlP2…)
- "Get it right! You're talking about bigotry not racism! It's a huge difference th…" (ytc_UgyvppMRM…)
- "HOW CAN YOU DOWNLOAD NIGHTSHADE, PLEASE TELL ME, BECAUSE I HAVE TO PUT A PICTURE…" (ytc_UgxOktsgc…)
- "Its super easy to create similar creepy responses. Preface it with "create a sho…" (ytc_UgxOPp_SF…)
- ". Do not need gun could we buy Ai How much everyone going to buy them it can liv…" (ytc_UgyePQAr7…)
- "It's impossible to have any system created by man immune to the faults of man. …" (rdc_j50y73q)
- "never wanted such an AI assistant for privacy reason but I gotta admit, that's t…" (ytc_UgyG0HmMv…)
Comment
The first 25 minutes of this video left me with the impression of anthropomorphization of LLMs meets publicity interview to sell a book on an AI scare. I'm a programmer, I think I have a pretty good grasp on what AI is and is not. And I'm left feeling like I'm watching a sponsored video that's an ad pretending to be more than what it is. Maybe the video gets better later on, but I couldn't finish it. Sorry.
youtube · AI Moral Status · 2025-10-30T21:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyABG2BqQo_bQ0RTeF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwKIBkTIjwF5QgSOR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzp0VQ5QCWvMSJH6-h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXt8u0LAlcm6JcuIJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2mNarWuP2T8jCTfJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxCJBabiQ3Iz1EJtSp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzwP2sI4oMWXokqcHV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfXKjmHwOdcVoYIAd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxNQQH7JScRsLDbMUp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwnUXuXIdWgn0uB8bd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
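Since the model returns one JSON array covering a whole batch, finding the coding for a single comment means parsing the array and indexing by `id`. A minimal sketch (the dimension names and the sample IDs come from the response above; the `lookup` helper itself is hypothetical, not part of any documented API):

```python
import json

# Two rows copied verbatim from the raw batch response shown above.
RAW = '''[
{"id":"ytc_UgyABG2BqQo_bQ0RTeF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwKIBkTIjwF5QgSOR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

def lookup(raw: str, comment_id: str):
    """Return the coding row for comment_id, or None if it is absent."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    return by_id.get(comment_id)

row = lookup(RAW, "ytc_UgyABG2BqQo_bQ0RTeF4AaABAg")
print(row["emotion"])  # indifference
```

Building the `by_id` dict once also makes it easy to flag comments the model skipped or coded twice in a batch.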