Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_UgxJXh75y…` — "So the ONLY 'evidence' they had was unproven facial recognition with a known fau…"
- `ytr_Ugwl6zP57…` — "If you want a youtuber that talks about this stuff but more a more scientific ba…"
- `ytc_Ugz76bkv5…` — "human time is more valueable than shite 9-5s. Passive income via ai agents is th…"
- `ytr_UgyUPmmO2…` — "@Summatradb if there is no difference why pretend you drew it when you didn't? …"
- `ytr_UgwDDGS41…` — "You are correct, that is the formula for Variance used in statistics. In Machine…"
- `ytr_UgzT2JDmy…` — "@CavemanCrafts86 Emotions are not necessary to replicate human behavior. An AI …"
- `ytc_UgzYlE4ZD…` — "The moment you see crying wife of the driver, it becomes clear that the purpose …"
- `ytc_UgzG_YqBD…` — "AI is just like fire when pre-humans discovered and experienced it, it became a …"
Comment

> When someone like Stephen Fry shares a two-year ultimatum, my first reaction is quiet, but clear: it’s not just about deadlines — it’s about direction. Warning us about AI isn’t doom-scrolling, it’s an invitation. Let’s not sprint toward the edge. Let’s slow down, listen more deeply, and make sure whatever comes next is built on presence, not panic.

youtube · AI Moral Status · 2025-08-11T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgztZVPGZbdDmWioE-p4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzd5zzMZ2PEzp-bu2R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxld4iPwcRhlHDGC_R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyx1RcnXHc_nuZMOcV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyabhwfKrPCmT_ij9F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwH7LbH2jxub2eg7jF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxvlLWOS_nZyRKMoo54AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwYGDfzeTKtpQ2IPXx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzqxcB_lReZO_ZhUzh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNiWk-NUr_2epyrpZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"resignation"}
]
```
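A raw response like the one above is a JSON array of per-comment codes, one object per comment ID, with one label per dimension. A minimal sketch of how such a batch could be parsed and sanity-checked is below; the allowed label sets are inferred only from the values visible in this sample and in the Coding Result table, so the real coding scheme may include labels not listed here (assumption).

```python
import json

# Label vocabularies inferred from the sample output above; the actual
# coding scheme may define additional labels (assumption).
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "company", "government", "user"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"regulate", "unclear", "ban", "industry_self", "none", "liability"},
    "emotion": {"approval", "fear", "outrage", "mixed", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose labels are recognized."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every dimension must be present and carry a known label.
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical one-row batch for illustration (the id is made up).
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"regulate","emotion":"approval"}]')
print(len(validate_batch(raw)))  # prints 1
```

Dropping (or flagging) rows with unrecognized labels, rather than silently storing them, is one way to catch a model that drifts off the coding scheme mid-batch.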