Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgyXyPYL_…` — "AI should take jobs of the youtubers who are below average but dishing out gyan.…"
- `ytr_UgwyK5L4T…` — "it must be like for human service you need to pay a premium than regular AI low …"
- `rdc_czl8z0q` — "Your asking why did they layoff 3,000 people..? All business usually end up with…"
- `ytr_UgzkcHXr6…` — "Imagine someone used ai to do that to your mother and spread it around. I'm sure…"
- `ytr_Ugwk4aUEj…` — "That's an interesting take! Sophia's name does have deep roots in Greek philosop…"
- `ytc_UgzcrkD1t…` — "I suspect the Prof is secretly a T-1000 sent by Skynet to speed up a sentient AI…"
- `ytc_Ugz-biYo8…` — "That will be two parallel societies: one living with AI, the other one living wi…"
- `ytr_UgwEL8Yjd…` — "@Sweetdude64but it literally is, how is it irrational? It's an obvious fact tha…"
Comment
“The DEFIANCE Act would impact individuals, like those Grok users creating deepfaked nonconsensual intimate imagery.”
What about the platform providers? They are the ones enabling the content and with the $$ for lawsuit payouts.
The Take It Down Act, made it a federal crime to post nonconsensual sexually explicit deepfakes and was signed into law last May. yet that did not stop the content from circulating on X. I’m not confident this will make a meaningful difference.
Section 230 of the Communications Decency Act, which protects platform providers from liability for user-generated content, should be repealed. Until they are held responsible, it’s doubtful much will change.
Platform: reddit · Category: AI Harm Incident · Posted: 1768344347.0 (Unix timestamp) · Score: -4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
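Each coded record carries four dimensions (responsibility, reasoning, policy, emotion). A minimal validation sketch is below; the allowed value sets are only those visible on this page, so treat them as an assumption — the full codebook may contain more categories.

```python
# Allowed values per coding dimension, inferred from the data shown on this
# page (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear"},
}

def validate(record):
    """Return (dimension, bad_value) pairs; an empty list means the record
    is valid under the assumed codebook."""
    return [
        (dim, record.get(dim))
        for dim, allowed in ALLOWED.items()
        if record.get(dim) not in allowed
    ]
```

For the record shown in the table above, `validate({"responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"})` returns an empty list.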
Raw LLM Response
```json
[
  {"id": "rdc_oi43aln", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_oi4alg9", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_ohzk3nq", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_nzflqa4", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_nzfro26", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
```
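The model returns one JSON array per batch, with one object per comment keyed by its ID. A minimal sketch of the per-ID lookup, using the exact response above:

```python
import json

# The raw batched response shown above: a JSON array with one coding
# object per comment, each carrying an "id" field.
raw_response = """
[{"id":"rdc_oi43aln","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"rdc_oi4alg9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"rdc_ohzk3nq","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"rdc_nzflqa4","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"rdc_nzfro26","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
"""

# Index the batch by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for one comment, or None if absent."""
    return by_id.get(comment_id)
```

For example, `lookup("rdc_nzfro26")` yields the company/deontological/regulate/outrage record rendered in the Coding Result table above.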