Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Maybe they should use AI for mental disability people for helping them with writ…" (ytc_UgwwcGwHf…)
- "@josehumdinger6872 pls tell me this - are you simply not a fan of sam's works? o…" (ytr_UgxZtzbwT…)
- "Yeah, art takes time to learn, and patience. Ai steals that patience and produce…" (ytr_Ugzt2HmtX…)
- "Can someone explain to me why exactly this is a bad thing? I am not a trump supp…" (rdc_dcwlaud)
- "If Ai takes over, i want the government to send stimulus to help those who lost …" (ytc_UgxPys5Q9…)
- "@Mrhellslayerz when the steam engine and automobile were invented many horse sta…" (ytr_Ugys0bVW9…)
- "Companies that replace humans should be forced to pay a per head tax for automat…" (ytc_UgzMDUzUJ…)
- "instead of having the best and brightest engineers work on AI together, we have …" (rdc_nsfn8jr)
Comment
No. The “thinking process” they use isn’t what they’re actually doing. Anthropic’s Golden Gate Bridge experiment shows this.
youtube · AI Moral Status · 2025-11-21T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxBOPUgAxtDXo-wByp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwokc-KpVgo6CRpy6d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6Ka-D95OSbmQsMuR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzgU2qTaZL7F-Jrnqh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYHHy5gvVceMr3wSV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxgusHR0AKOCY2nerF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNgO0hiXfGxYnYIsB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzI_6kpd0xiTB8iXuh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugym50IIHEPf7O5tOqN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwBljTBFUwkasW5CmV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
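A raw batch response like the one above can be parsed and indexed by comment ID to drive the per-comment lookup described at the top of this page. A minimal sketch in Python, assuming the response is valid JSON and that every record carries the four dimensions shown in the Coding Result table (the dimension names come from this sample; the helper name `index_batch` is hypothetical):

```python
import json

# The four coding dimensions visible in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index each coding by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Each record must carry an id plus all four coding dimensions.
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record missing fields: {missing}")
        coded[rec["id"]] = {k: rec[k] for k in DIMENSIONS}
    return coded

# Two records from the sample batch above, used as a worked example.
raw = '''[
  {"id":"ytc_UgxBOPUgAxtDXo-wByp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzNgO0hiXfGxYnYIsB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

coded = index_batch(raw)
print(coded["ytc_UgzNgO0hiXfGxYnYIsB4AaABAg"]["policy"])  # → regulate
```

Keying by `id` makes the lookup-by-comment-ID view a single dictionary access; the field check guards against partially coded records slipping through.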