Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugwhyz3v4…` — I personally don’t like A.I art because it’s taking bits and pieces from other a…
- `ytc_UgzdqsrNa…` — It's not the technology that is the problem. It's the corporations and corruptio…
- `ytc_UgzUa7tdq…` — When someone else overrides control of their AI and uses it against them only th…
- `ytc_Ugy2xy34h…` — Very interesting conversation. I think his concern about AI safety is warranted…
- `ytc_UgxUhGWMp…` — I wish the AI "injection" would stop. I want to dial back a bit and go back to "…
- `rdc_ks6was8` — Well if someone produced AI porn of you fucking your mother, I think you would h…
- `ytc_UgyKlovs6…` — The AI is NOT safe. My dad and all the other people who had the same job at IBM …
- `ytc_UgwiWnjHC…` — The really scary part. No one will know any fact from fiction about anything goi…
Comment

> This makes me realize how dangerous interacting with a LLM could be for people with mental disorders like schizophrenia.

reddit · AI Moral Status · posted 1734383177.0 (Unix timestamp) · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"rdc_m2e1uv3","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"rdc_m2emc63","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"rdc_m2fh3zy","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"rdc_m2cf0lv","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"rdc_m2gggod","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
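The raw response above is a plain JSON array, one object per coded comment, with the same four dimensions shown in the coding-result table (`responsibility`, `reasoning`, `policy`, `emotion`). A lookup by comment ID, as this page offers, can be sketched in a few lines of Python. This is an illustrative sketch, not the tool's actual implementation; the `lookup_coding` function name is ours, and the sample data is a subset of the response shown above.

```python
import json

# A raw LLM response for one coding batch: a JSON array of per-comment
# objects carrying the four coding dimensions (subset of the example above).
RAW_RESPONSE = """[
  {"id":"rdc_m2e1uv3","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_m2emc63","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def lookup_coding(raw: str, comment_id: str):
    """Return the four coding dimensions for one comment ID, or None.

    Skips malformed rows: a row must carry the requested ID and all
    four dimensions to count as a valid coding result.
    """
    for row in json.loads(raw):
        if row.get("id") == comment_id and all(d in row for d in DIMENSIONS):
            return {d: row[d] for d in DIMENSIONS}
    return None


print(lookup_coding(RAW_RESPONSE, "rdc_m2e1uv3"))
# {'responsibility': 'none', 'reasoning': 'consequentialist', 'policy': 'none', 'emotion': 'fear'}
```

Requiring all four dimensions before returning a row is a defensive choice: LLM batch output occasionally drops fields, and a partial row is better rejected than silently displayed.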