Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "If he knew it was ai why would he be dumb enough to fall for that…" (ytc_UgzBIVrjz…)
- "Bro did these comments not watch the short😭🙏 Kreekcraft literally said it was a …" (ytc_Ugx9jAFhj…)
- "Not topic of the video but whyd we want to stop robots and ai in war? What does …" (ytc_Ugzb2sK_z…)
- "HAHAHAHAHA mate if you think character ai is bad… oooh boy there are some worse …" (ytc_UgwOzpVBB…)
- "As someone in graphic design, coding, web development, and UX, this man doesn’t …" (ytc_Ugz-2QceZ…)
- "Don't know about the rest of you, but I find this very concerning. What if some…" (ytc_UgyqJHzWd…)
- "@Crysta11ize People put a ton of effort into making AI though, that stuff doesnt…" (ytr_UgwjrbMfw…)
- "Being it on, if you haven't learned how to hunt, fish, farm, amd garden, you're …" (ytc_UgwlcDGbH…)
Comment
yah. I still think LLMs are fancy autocomplete, but maybe we underestimate what many of our internal mind processes are actually just that. However, for true AGI i think we would really need world models where way more than just text is fed. Right now LLMs I think is just a big echo of the accumulated human knowledge found on the internet.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Date | 2025-10-30T22:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwtxs-CncYNop_0tsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1yXpN6_mw1Mbo2jJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyB7ndzWaq0zAswosp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyExt2nRhtNP0DqbJ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxA2eJgnKVc_b_B4TZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyUwm8CoQz08K_rFqV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgznTGQfXRmC1stpMRR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyMf4BIlZYdS76nVbt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzmxwLGLl4MRC3aboN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4EwcGESnFAEd1Obp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
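The per-comment values in the "Coding Result" table come from a raw response like the one above: the model emits a JSON array with one object per comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a response could be parsed and indexed for lookup by ID (the `parse_codings` helper and the `"none"` fallback are illustrative assumptions, not part of the pipeline shown here):

```python
import json

# The four coding dimensions used in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw_response: str) -> dict[str, dict[str, str]]:
    """Index a raw model response (JSON array of codings) by comment ID.

    Keeps only the known dimensions; a missing dimension falls back to
    "none" (an assumed default, not confirmed by the source).
    """
    rows = json.loads(raw_response)
    return {row["id"]: {d: row.get(d, "none") for d in DIMENSIONS}
            for row in rows}

# One entry in the same schema as the raw response above.
raw = '''[
  {"id": "ytc_Ugwtxs-CncYNop_0tsJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "indifference"}
]'''

codings = parse_codings(raw)
print(codings["ytc_Ugwtxs-CncYNop_0tsJ4AaABAg"]["reasoning"])  # consequentialist
```

Indexing by ID mirrors the "Look up by comment ID" view: once parsed, any coded comment's dimensions can be fetched in constant time.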