Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
I'm not against such a law. Maybe a better way to express my concern is: "how do you enforce it?"
Not sure guns work as a good counterexample. Guns are physical items and are not very ambiguous. You are unlikely to possess a gun "by accident" just because you happen to have the parts that make one up. And even if some slip through the cracks, the sale of physical items is easier to control.
On the other hand, there is no such thing as dedicated face-recognition hardware. Hardware capable of it is already carried in most people's pockets. Nor can you sensibly criminalize it after the fact, since nothing special enables that capability compared to other uses of cameras and processors, while gun parts probably don't have very plausible secondary uses. So you'd have to police either the software or the use of the data, and I don't think there is a good model for doing either of those yet.
Source: reddit
Topic: AI Harm Incident
Posted: 1583267747 (Unix timestamp)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_fjcrglm", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_fjd83ff", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_fje9oxh", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "rdc_fjdbn42", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_fje36qc", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
```
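A batch response like the one above can be parsed and keyed by comment ID so that an individual coding can be looked up directly. The sketch below assumes only the field names visible in the JSON; the helper name `index_by_id` is hypothetical, not part of any pipeline shown here.

```python
import json

# Raw model output in the batch format shown above (truncated to two records).
raw = """
[
  {"id": "rdc_fjcrglm", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_fje9oxh", "responsibility": "government", "reasoning": "deontological",
   "policy": "liability", "emotion": "mixed"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw)
print(codings["rdc_fje9oxh"]["policy"])  # -> liability
```

With the records indexed this way, the "Coding Result" table for any comment is just a formatted view of one dictionary entry.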