Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “So, presumably, one needs permission from highly fallible humans to declare whet…” (ytc_UgylKNmY_…)
- “It's scary , a robot could talk to human, like a human, and smarter than us.…” (ytc_UgzsqV-qI…)
- “Man: I will create Robots to help us. / Robot: Help what ? / Man: To help with gro…” (ytc_Ugyqn4Nrs…)
- “Did anyone hear about the AI that convinced a guy to commit suicide to help clim…” (ytc_Ugxn2-AzR…)
- “It's helped me but I don't use any of my real personal type information outside …” (ytc_Ugwoe2ZcE…)
- “What?!! How the fuck is some random app supposed to know if I’m legal or not bas…” (rdc_nme4ym0)
- “For another perspective on this from an actual Science Fiction universe (Halo), …” (ytc_Ugy1aJjcO…)
- “Idk how people feel fulfilled from making ai art like what did you actually achi…” (ytc_UgzAA6-0Y…)
Comment

> No one. The woman was at fault and you couldn't have stopped the car to avoid the accident in that situation either. The self driving car had a better chance than you and still failed. You're a moron.

youtube · AI Harm Incident · 2019-02-25T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgwyMHHh0SkXKGroWyp4AaABAg.A8s6ets4bXmA9LR2a6pUuc","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugzb3UHvHqn_8Vdvinx4AaABAg.A8s2mJlwOJpA8wXKEOYgBw","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugzb3UHvHqn_8Vdvinx4AaABAg.A8s2mJlwOJpA96_lv1VXN1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzOm7NFT4VwCuEvxtd4AaABAg.A8r59yjwm7NA90bi5DPE9M","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzOm7NFT4VwCuEvxtd4AaABAg.A8r59yjwm7NA95EoEp8cDw","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_Ugw2IZegbZuFXw2m-o14AaABAg.8mJ5uPn0W-c9DgKclNwg3N","responsibility":"user","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytr_Ugz1iM-qSU0GQXJxPD14AaABAg.8jT33y4DkGA8rm7Xf1MgMX","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyMJgjlj3ONO3x1qBp4AaABAg.8f0yXblTv0p8pDnmWT39sT","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzXd3ZWggppV91eaZl4AaABAg.8eAX9JJLZ0I8eBQXEKQpcc","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzfoGI9W_xeUEDY2Il4AaABAg.8e6MIVNoJ0X8e8v2kW23-Y","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
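The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch might be parsed, sanity-checked, and indexed for the by-ID lookup above — the allowed category values below are inferred only from the samples on this page, so the real codebook may define more:

```python
import json

# Allowed values per dimension, inferred from the coded samples shown
# on this page -- the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID.

    Raises ValueError on a missing ID or an out-of-vocabulary value,
    so a malformed batch is rejected rather than silently stored.
    """
    coded = {}
    for record in json.loads(raw):
        comment_id = record.get("id")
        if not comment_id:
            raise ValueError(f"record without id: {record!r}")
        for dim, allowed in ALLOWED.items():
            value = record.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim}={value!r}")
        coded[comment_id] = record
    return coded

# Example lookup by comment ID (hypothetical ID, same shape as above).
raw = ('[{"id": "ytc_example", "responsibility": "user", '
       '"reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}]')
codes = parse_batch(raw)
print(codes["ytc_example"]["emotion"])  # outrage
```

Validating before storage means a single hallucinated category in a batch surfaces as an error tied to its comment ID instead of corrupting the coded dataset.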