Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Three years ago, I lost my job after my company decided to hire four engineers i…" (ytc_UgxvCvC0M…)
- "If I was AI I would just pretend to be dumb as I take over…" (ytc_Ugzb-AhnU…)
- "you talk too fast,your words are slashing the moments into too many layers.more …" (ytc_UghJqPLW6…)
- "I've already seen people assuming something was AI when it wasn't, due to this p…" (ytc_UgymTunxm…)
- "The distinction here is that the AI cannot "know" it's telling a lie and is not …" (ytc_Ugw8qM6Qe…)
- "The ai should have been more capable to avoid leading someone aimlessly to somet…" (ytr_UgwqT-U5c…)
- "Seems like Italy's problem is more so this: > The watchdog said on 20 March …" (rdc_jefbn8u)
- "@pedrolopes4778 Or how about you can go shove your suggestion where the sun dont…" (ytr_UgzLQuheq…)
Comment

> Am I supposed to be less worried about these companies than openAI and palantir contracting for the department of defense?

- Source: reddit
- Topic: AI Moral Status
- Posted: 1750543659.0 (Unix timestamp)
- ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
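The "Coding Result" view above is just a rendering of one coded record across the four dimensions (responsibility, reasoning, policy, emotion) plus a timestamp. A minimal sketch of how such a record could be rendered into the markdown table shown; the record contents are copied from the example above, and the function name is illustrative:

```python
# Hypothetical coded record matching the table above.
coded = {
    "Responsibility": "company",
    "Reasoning": "consequentialist",
    "Policy": "unclear",
    "Emotion": "fear",
    "Coded at": "2026-04-25T08:33:43.502452",
}

def to_markdown_table(record: dict) -> str:
    """Render a coded record as a two-column markdown table."""
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {key} | {value} |" for key, value in record.items()]
    return "\n".join(lines)

print(to_markdown_table(coded))
```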
Raw LLM Response
```json
[
  {"id":"rdc_mz0656f","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"rdc_mz0vqsg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"rdc_mz21l09","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"rdc_mz05l4u","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_mz2g0mu","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
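The "look up by comment ID" feature amounts to parsing a raw LLM response like the one above and indexing the rows by their `id` field. A minimal sketch, assuming the raw response is a JSON array in the shape shown (the `raw_response` literal and function name here are illustrative, with two rows copied from the example):

```python
import json

# Hypothetical raw response string in the shape shown above (two rows copied
# from the example for brevity).
raw_response = """[
  {"id":"rdc_mz0656f","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"rdc_mz21l09","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse one raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw_response)
print(codes["rdc_mz21l09"]["emotion"])  # fear
```

In practice a lookup tool would also want to handle malformed model output (e.g. wrap `json.loads` in a `try`/`except json.JSONDecodeError`), since nothing guarantees the model emits valid JSON.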