Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by its comment ID or by browsing the random samples below.

Random samples
- Hi, software engineer here. The issue with the jobs people are trying to make AI… (`ytc_UgwiD97bW…`)
- Can you set the trigger action to answer the phone automatically without a form … (`ytc_Ugw35ztYe…`)
- Doesn’t matter if they stop monetizing ai content. If people are watching those … (`ytc_UgzaZ2jIG…`)
- It will just stop using power brakes... The car behind will stop too... And self… (`ytc_UgzjIPWV9…`)
- So, if we humans are living in a simulation in which unchecked AI progress will … (`ytc_UgyyGegSn…`)
- That's not entirely correct from what I know. Valve warns developers if they spo… (`rdc_jwv2xjh`)
- @theskeletonboi There are several flawed assumptions and fallacies in the state… (`ytr_UgxSQKgIx…`)
- Wait, what? If I identify as a robot, can I buy a full auto Tommy gun? I'm a rob… (`ytc_Ugwq1ndlr…`)
Comment
I bet the system can already take out aircraft if someone was willing to spend that much to do so. They just need to change the name.
reddit · Cross-Cultural · 1501585385.0 (Unix timestamp) · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_dl0cz3w","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"rdc_dkzoqgh","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"rdc_dkzyfdk","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"rdc_dkzqcjk","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"rdc_dl09ehn","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
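A batch response like the one above can be turned into a comment-ID lookup once parsed. Below is a minimal sketch, assuming only that each row carries exactly the five keys shown in the sample (`id` plus the four coded dimensions); the `parse_batch` helper and `REQUIRED_KEYS` set are illustrative names, not part of the actual pipeline:

```python
import json

# Keys taken from the sample rows above; any further schema constraints
# (e.g. the full set of allowed values per dimension) are not shown here.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> dict:
    """Map each comment ID to its coded dimensions, checking required keys."""
    coded = {}
    for row in json.loads(raw):
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing keys {sorted(missing)}")
        coded[row["id"]] = {k: row[k] for k in REQUIRED_KEYS - {"id"}}
    return coded

raw = '''[
  {"id":"rdc_dl09ehn","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''
print(parse_batch(raw)["rdc_dl09ehn"]["emotion"])  # indifference
```

Keying on `id` is what makes the "look up by comment ID" view cheap: the detail panel for `rdc_dl09ehn` is just `coded["rdc_dl09ehn"]`.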