Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I don't need a robot giving me chill convo over a trouble... that isn't good cus…" — ytc_UgyIO15rD…
- "In a future where superintelligent AIs (SIs) coexist with short-lived humans, th…" — ytc_UgxxQlUQt…
- "Okay, so I do use AI for reference images but not for things that are very gener…" — ytc_UgwTH7BcK…
- "It doesn't think....it simulates thinking. There is nothing there to think. What…" — ytr_UgwsK1Cn6…
- "you brutally nitpicking on the AI while it shows a happy and cute pikachu is kin…" — ytc_Ugz4A5lgw…
- "Unfortunately now they’ve come up with ChatGPT five I tried this and she would n…" — ytc_Ugyd8GGbU…
- "It's pretty ridiculous. I used Google lens also. It was a pic of the end of a ca…" — ytr_UgwhelrLU…
- "gurl your art is too ugly to be considered a.i. 😘(/jkjkjkjkusually Sarcastic so …" — ytc_UgzNKHk-c…
Comment
Am I the only AI professional that gets very annoyed by this guy? The jump from minute 46 and 49 is immense. It assumes AGI from the way he is talking about it. AI today doesn't think. An LLM is something where you give it an input, and it gives you an output. It's very advanced compression. The jumps he makes talking about self preservation is entertainment value and assumes technology that doesn't exist.
youtube · AI Moral Status · 2026-03-02T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw_aEXTFogAnQ2YMMd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgytV1pB9MINc2dSpMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxirK7zMYMdyUSLAzV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzmTc702KrCMa97eUl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw0R-e1dSRDU2umLYt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxpvyvIn7j1qgSg9Lx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYnZAcijKqJ6uVF6t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxvGmQ29xS0swi0S2B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzlJloebKr_q-5LDah4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx8t3JtLkyvFanpHgB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
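A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator; the allowed value sets are inferred only from the codes visible on this page, not from the actual codebook, so they are assumptions.

```python
import json

# Allowed codes per dimension, inferred from the values visible above
# (assumption -- the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"resignation", "indifference", "approval", "fear",
                "mixed", "outrage"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every coded comment."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# Example: one row taken from the raw response above.
raw = ('[{"id":"ytc_UgzlJloebKr_q-5LDah4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"none","emotion":"outrage"}]')
rows = validate_response(raw)
print(rows[0]["emotion"])  # outrage
```

Validating against a fixed vocabulary catches the common failure mode where the model invents an out-of-schema label, so bad rows fail loudly instead of silently entering the dataset.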