Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgxvjlDd_…: I genuinely love how pissed the AI gets whenever Dave brings up his Astro Bot sp…
- ytc_UgyX9d-78…: Self-driving cars should have a way to communicate with each other to decide wha…
- ytc_UgzfXb4qM…: And then Austrian painter enters the chat. What you think will happen when "no o…
- ytc_Ugyvv_50N…: I will spend my whole life to this Ai cause there's no one good people on earth…
- ytc_UgyIqqhjP…: I disagree with your statement about a fixed worldview. It depends on the source…
- rdc_fcsnyog: I really hope this is the decade that lets us kick cancer to the curb through te…
- ytc_UgykW0vEW…: When you turn on your computer and your AI s says, "Hey, I had an idea about how…
- ytc_UgwPQoK7h…: Even if ai doesn’t destroy our way of life either economically or physically, it…
Comment
> do whatever will minimize bodily harm period.
> 2 motorcycles on ether side? the car takes the hit.
> head-on? swerve into a suv
> yes its a cold decision but the best overall.
> now people don't want to buy that car? make a law so all self driving cars have a even playing field....
youtube · AI Harm Incident · 2015-12-09T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjFBYIvdRYVgHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggfFxjEN8s_5ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugj3QKzIe1Eq-3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UghHtM6MCJz6TXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UggQNW11cKIdvngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghwhUYMzBGyVHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgiBxpBHRTAhPHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggPxpiSP8H8OngCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiVRBq_S_B0h3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugjz5578tI7sb3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
```
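A raw batch response like the one above should be validated before the labels are stored, since the model may return malformed records or labels outside the codebook. A minimal sketch of such a check, where the allowed label sets are inferred from this single sample and are an assumption rather than the full codebook:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may include additional categories (assumption).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "user", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only records that have
    an id and whose labels all fall inside the allowed sets."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record response for illustration:
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}]'
print(len(validate_batch(raw)))  # 1
```

Records that fail validation can then be re-queued for recoding instead of silently polluting the coded dataset.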