Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The only reason to explore making AI capable of suffering or having an identity with moral worth is to thoroughly understand how to avoid it. We’re pretty confident we can build machines that do nearly anything we would want without this. We never need to build pass the butter robots that suffer from the cruel idiocy of their creators.
youtube · AI Moral Status · 2025-04-04T14:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyy1rihbMVh5Gm1lht4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzuKT3-UPboTK0ttEl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3QECLseQAJU31HOF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzNw45KB6TM7ZfAJml4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwhKyNDzzKSWqeO0VB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwGWaaI5CMpBNNRkkF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxqtY1nBXu5nSOjwdR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz6Qj55YajPtfGOE6B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyFFFSKz5cvsVZYpth4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy10WI7WkkILfjEGEp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}
]
```
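The raw response is a JSON array with one record per comment ID, carrying the four coded dimensions shown in the table above. A minimal sketch of parsing it and looking up a comment's coding by ID (the record structure is taken from the sample response; the `lookup` helper name is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment ID.
# One record from the batch above is reproduced here as sample data.
raw = '''[
  {"id": "ytc_Ugy10WI7WkkILfjEGEp4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "industry_self",
   "emotion": "fear"}
]'''

# Index the records by comment ID for O(1) lookup.
records = {item["id"]: item for item in json.loads(raw)}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if uncoded."""
    return records.get(comment_id)

coding = lookup("ytc_Ugy10WI7WkkILfjEGEp4AaABAg")
print(coding["emotion"])  # → fear
```

An unknown ID simply returns `None`, which is why `dict.get` is used rather than direct indexing.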