Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgxUdLo40…` — "I'd like to see AI delete boxes of booze and make bad life choices as good as me…"
- `ytr_Ugy88x3SV…` — "@amehcakeface What the heck are you talking about? I been doing art daily and im…"
- `ytc_UgyqS22vc…` — "Anyway, ima share something **deep breath* I found an ai called "evil salon …"
- `ytc_Ugyk7kEow…` — "The world is gonna end in 2002 anyways. I mean, 2012. Err, 2014. Oh no, sorry, t…"
- `rdc_et7sr4w` — "Hahaha so we will have to negotiate a trade deal with a united Africa. Holy fuck…"
- `ytc_UgxMw3tR7…` — "How crazy are you when you talk to a robot like it is human? Do the inventors of…"
- `ytr_UgyZYocy4…` — "We appreciate your feedback. While AI technology may seem daunting at times, it …"
- `ytc_Ugwb_FieJ…` — "I never will, I love listening to JRE but I get annoyed when Joe talks about AI …"
Comment
Remember in I robot where Will Smith takes over the car manually due to it going out of control. All cars need to have that feature just encase some bullshit goes down.
reddit · AI Harm Incident · timestamp 1475423361.0 · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_d8asr0o","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"rdc_d8akbl3","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_d8ablc1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"rdc_d8auzt1","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"rdc_d8afscl","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
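The raw response is a JSON array of per-comment codes, one record per comment ID. A minimal sketch of how such a batch might be parsed and validated before storing the coding result; the allowed dimension values here are inferred only from the records shown above (an assumption — the actual codebook may define more), and `parse_batch` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per coding dimension, inferred from the responses above.
# ASSUMPTION: the real codebook likely defines additional values.
ALLOWED = {
    "responsibility": {"none", "company", "developer"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "industry_self", "regulate"},
    "emotion": {"approval", "fear", "outrage", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM JSON array, keeping only well-formed records."""
    valid = []
    for rec in json.loads(raw):
        has_id = bool(rec.get("id"))
        dims_ok = all(rec.get(dim) in values for dim, values in ALLOWED.items())
        if has_id and dims_ok:
            valid.append(rec)
    return valid

raw = '''[
  {"id": "rdc_d8auzt1", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''
print([r["id"] for r in parse_batch(raw)])  # ['rdc_d8auzt1']
```

Rejecting malformed records at parse time keeps out-of-vocabulary codes from silently entering the coded dataset; a stricter variant could instead raise on the first invalid record so the batch can be re-sent to the model.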