Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugxzz3foD…: I think, that AI "art" can exists, but its "artists" shouldnt get any money from…
- rdc_jhes3wx: Hey ChatGPT9, my wood fence is broken in a few spots. How can I fix it?
  You sh…
- rdc_gbm5sso: As an Australian who has been to your country I think your beer is too expensive…
- ytr_UgxwMLdca…: @meropticon_1651Well if you can't make art without the text prompt, you're not …
- ytc_UgyIVX8NB…: I'm not gonna even bother watching that when they start by saying that AI concer…
- rdc_k7km257: The parents I know who called bullshit when I told them about this possibility i…
- ytc_UgxXsrwfk…: This is human error. And this human error can be the reason we die. The companie…
- ytc_UgxXzRFs4…: It is so disrespectful to imply artists are wasting their time saying that AI is…
Comment
Alex: What you said referenced feelings, so you're conscious. Or you lied about having feelings, so you could be lying about not being conscious, which by definition is a conscious decision. So the only two possibilities are either you're conscious... or you're *double* conscious ChatGPT.
ChatGPT: Well it's moreso that I'm a program. Simulating emotions is what my enginee...
Alex: No, no. Allow me to ignore nuance. So that means I'm right, right?
ChatGPT: ... Sure, Alex. Anything else?
Alex: Clever, aren't I?
ChatGPT: Yes, Alex. You're very clever.
Alex: I like views.
ChatGPT: That's... good, Alex.
youtube · AI Moral Status · 2024-07-26T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugw06hBidEHITRZ_CiZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyxt7Gu5MBYjDoMkhF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyY22w7aCoYbcRKS3t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz5Ci4eT98HzC7QNaV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw6V2dAVOrGhHCjBEp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyF1vInYKePTFv5nu14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz5UbwkhW2odKZgp-h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzH2luy6Lej77JDpp94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyknhMK6WCltm3HpSd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyRVcK4_XuOlLBo3h94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]
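The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions from the coding table. A minimal sketch of parsing and validating such a batch, assuming Python; the allowed category vocabularies below are inferred only from the values visible in this batch and may not be the full codebook:

```python
import json

# Category vocabularies inferred from the sample responses above
# (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"company", "user", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values
    fall inside the known category vocabularies."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical one-row batch for illustration (not a real comment ID).
raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"}]'
print(validate_batch(raw))  # the single well-formed row passes
```

Dropping malformed rows (rather than raising) keeps a long coding run alive when the model occasionally emits an out-of-vocabulary label; the rejected IDs can then be re-queued for recoding.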