Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgwHPVILu…`: I dont mind As long as the person is not toxic, before it wasnt okay but after s…
- `ytc_Ugy1GFvPP…`: This is the most ridiculous study and video ever. You mean to tell me that havin…
- `ytc_UgyqkxJki…`: Wife; i have a headache. Robot girl; i need to recharge battery.... Man; can i b…
- `ytr_UgwIApRF5…`: We appreciate your feedback! If you're interested in engaging with advanced AI m…
- `ytc_Ugxhu-LVQ…`: This guy is just butt hurt he got booted from the very company he is bashing on …
- `rdc_deu9nnb`: I think the argument he is making is that the more information you have the bett…
- `ytr_UgyA3kG6r…`: What do you think of Dall-E 3's latest changes where any scene with humans must …
- `ytr_UgxuC7jfl…`: "Thanks for your comment, @edynujaidi1966! Is that a real robot? Well, let me te…
Comment

> I know there are reasons. hose reasons will be greatly exasperated if they substitute learning life skills with talking to a bot. It will get MUCH WORSE.
>
> You do understand we recently seen an AI company pull a model for one day and some people verged on suicidal over that ... that is a problem, no? Like, something we'd not want widespread?

reddit · AI Moral Status · 1754757762.0 (Unix timestamp ≈ 2025-08-09 UTC) · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_n7t2dlu", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_n7st4c2", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_n7stjpd", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_n7tgcgu", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_n7tky4j", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```
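The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how a lookup-by-ID could work against that payload; `lookup_coding` and `DIMENSIONS` are hypothetical names (not from the tool itself), and the raw string here is trimmed to two of the entries shown above:

```python
import json

# Two entries copied from the raw batch response above (hypothetical trimmed payload).
raw_response = (
    '[{"id":"rdc_n7t2dlu","responsibility":"none","reasoning":"virtue",'
    '"policy":"none","emotion":"outrage"},'
    '{"id":"rdc_n7stjpd","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"}]'
)

# Dimension names taken from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str):
    """Parse the model output and return the coding row for one comment ID."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    row = by_id.get(comment_id)
    # Guard against partially filled rows before trusting the result.
    if row is not None and any(dim not in row for dim in DIMENSIONS):
        raise ValueError(f"incomplete coding for {comment_id}")
    return row

coding = lookup_coding(raw_response, "rdc_n7stjpd")
print(coding["emotion"])  # fear, matching the Coding Result table above
```

An unknown ID simply returns `None`, which lets the viewer distinguish "never coded" from "coded with missing dimensions" (which raises).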