Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews, each with its comment ID):

- `ytc_UgwaUI4Kd…` — i was going to say that this is a deepfake ......but alas, a lot of people have …
- `ytc_UgwORoXaY…` — And now in 2025 we got sites like "MambaPanel" who do similar a service. I mean …
- `ytc_Ugzumfsyb…` — Umm... that's not whats happening here and this video is years old...when AI was…
- `ytr_Ugx6q57TP…` — @Bento_AUS_3059 Australia is a strange place. I’ve never seen an artist ridicu…
- `ytc_Ugyh-25U7…` — AI is the new black, and I'm fatigued already. I'm just waiting for the self hat…
- `ytr_UgyoY44OV…` — I disagree that's like saying an ai can't understand it's processor. It's a stat…
- `ytc_UgwdnO_jV…` — It’s one thing to create AI art for the fun of it, but it’s quite another to cal…
- `ytc_Ugy72D3wT…` — It all comes down to statistics, they will make mistakes, but if they make mista…
Comment

> Yes and no. If it ever gets the point that AI have human level intelligence, emotion, and sense of self preservation, than yes. But frankly, we should probably avoid creating that in the first place, if the interests of a self-aware AI is ever benefitted by nuking the planet, we're all royally fucked. As cool as my boy Zenyatta is, we'd likely reach HAL 9000 long before we ever reached him.

Source: youtube · Video: "AI Moral Status" · Posted: 2017-02-24T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ughl6WSLm9wCB3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjpW_cqqeU343gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgiYhlUpCB2i23gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugi0N_B54KvacngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjGkGMrvCMT_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugj8xpx1PUjL6XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghP6IRxjakkx3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugi2WXL0T1TMH3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjCRASqFFZCF3gCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiPBTwclustlXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
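A response like the one above has to be parsed and checked before the labels are stored, since an LLM batch can contain malformed or off-schema records. The following is a minimal sketch of such a validator; `validate_batch` is a hypothetical helper, and the allowed category sets are inferred only from the values visible on this page (the real codebook may define more):

```python
import json

# Allowed values per coding dimension, inferred from the labels seen in
# this page's table and raw response; assumed, not the official codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip entries missing the comment ID
        # Every dimension must be present and hold a known label.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Toy record with a placeholder ID, for illustration only.
raw = ('[{"id":"ytc_X","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # → 1
```

Records that fail validation are dropped rather than repaired here; in practice one might instead re-queue the offending comment IDs for a second coding pass.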