Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "And thus, the uprise starts. Mark my fucking words: AI is coming and it will com…" (ytc_Ugxmep_VQ…)
- "Just my two cents: It's a bit short-sighted and closed-minded of humans to con…" (ytc_Ugw6nGM37…)
- "@NeutralUnphase do your research buddy :) what you just explained is exactly st…" (ytr_Ugx8iD5vQ…)
- "The only way an artist can protect their work is to not upload it in the first p…" (ytc_UgxqROK6f…)
- "Hallucinations, i dont want AI confidently lying to my doctors. Its an inherent …" (ytc_Ugz3o7cNU…)
- "It feels like you can’t decide whether the video is about AI in general or just …" (ytc_Ugx571oPb…)
- "This is a naïve way of thinking about it imo. It's already here, it's already st…" (rdc_g0x3p0e)
- "What he doesn't seem to understand, that by not making it copyrightable, it mean…" (ytc_Ugx8W89AD…)
Comment
I'm driven to think of ourselves and most animals as the super intelligences of the past - created by basic bacterial lifeforms (who still exist, everywhere) and driven to survive, to a large extent, by keeping some microbes happy and rejecting others. It's a very odd point of view, considering we might try to control the base instincts of a super intelligent AI to keep ourselves comfortable and alive as well.
- Source: youtube
- Video: AI Moral Status
- Date: 2025-10-31T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxcH58RRZU1U15_cQ14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwZDBpQXi5RhcM4Zjt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyB-j9zxdB8jMz3cS94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugymdsy0iFBh4TdLQwh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwfGyKf3hyd9KY8Y414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzfbgBu_DyFNx-Qkrh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgylAF-k1dwxc4iM3xd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwyCl8PYJoE4ZrkwSJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxPAlFOZf5P9-yFXq14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2DZPPy0JmWnTf_XF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
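The raw response above is a JSON array with one coding object per comment, each carrying the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A lookup-by-comment-ID over such a response can be sketched as follows; the `index_codings` helper is illustrative and not part of the app, and the embedded excerpt is trimmed from the full response:

```python
import json

# A trimmed excerpt of the raw LLM response above: one object per
# comment, keyed by the four coding dimensions.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxcH58RRZU1U15_cQ14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwyCl8PYJoE4ZrkwSJ4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding object by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)
row = codings["ytc_UgwyCl8PYJoE4ZrkwSJ4AaABAg"]
print(row["responsibility"], row["emotion"])  # distributed approval
```

With the codings indexed this way, the "Look up by comment ID" action reduces to a single dictionary access; the row retrieved here matches the Coding Result table above (responsibility = distributed, emotion = approval).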