# Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing random samples.

## Random samples
- "Who allowed essentially industrial plants to be zoned right next to residential?…" (ytc_UgxYx_E3-…)
- "Similar to Medicare and Medicaid, maybe we can have both socialism and capitalis…" (ytc_Ugx__aS6P…)
- "Oh yea very realistic we ll have 300 million people running their own virtual bu…" (ytc_UgzsnQHIW…)
- "Sam Altman has a moral compass by swindling the people out of $billions and Musk…" (ytc_UgwFSh2gy…)
- "i think \"AI art\" is only good for when you need a cheap way to get a drawing and…" (ytc_UgwPOBXhy…)
- "He’s definitely an intelligent person but he’s attached to this idea of humans b…" (ytr_UgzrHnCh5…)
- "Because I love my family, I would put them in a Waymo where available rather tha…" (ytr_UgwSYprJ1…)
- "the hallucinations remind me of Sir Penrose's argument, that AI will never be co…" (ytc_UgwoDM_JG…)
## Comment

> There are going to be accidents no matter what's driving because people are friggin idiots!! On average, 5000 pedestrians get struck and killed by motorists every year. The fact that the lady walked out into the street in a manner that the car couldn't see her quick enough to avoid her is irrelevant. She's the one that didn't take responsibility for her own safety and make make sure it was safe to cross the road before doing so. It would have happened regardless if it were an autonomous or human driver.

Platform: youtube · Topic: AI Harm Incident · Posted: 2018-03-23T13:3… · Likes: 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response

```json
[
{"id":"ytc_UgyDIFPstXKa3ha1ywV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugys0phjno1Zbkh0idh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzCVZnJea73RcVj1DN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzdbNCE9fYX-3rOdYF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"discomfort"},
{"id":"ytc_UgxSU2YDmMs3La1xwqB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxlcIt5Lak7SCXJDPt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxnxFg4PMC74cJ0qJ54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyuevw8_A-m_ukiqMN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgztScT7ln6OBsAvKH14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzVZMPvbaPdlMGN7bB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
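A batch response like the one above can be parsed and indexed by comment ID, which is how a "look up by comment ID" view can map a raw record back to its coded dimensions. Below is a minimal sketch; the `ALLOWED` vocabularies are inferred from the values visible on this page, not from the project's actual codebook, and `index_response` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the output shown above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"mixed", "fear", "indifference", "discomfort",
                "resignation", "outrage", "approval"},
}

def index_response(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    dropping any record with an out-of-vocabulary value."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[rec["id"]] = rec
    return records

# One record copied from the response above, serving as a usage example.
raw = ('[{"id":"ytc_UgxnxFg4PMC74cJ0qJ54AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = index_response(raw)
print(coded["ytc_UgxnxFg4PMC74cJ0qJ54AaABAg"]["policy"])  # → none
```

Validating against a fixed vocabulary at parse time is a cheap guard against the model emitting a label outside the coding scheme.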