Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "everything you say is right. and yet, NO MATTER how convincingly we describe and…" — ytr_UgyDN1IUA…
- "ur already use to ..even ur comment u can tell ur English power by Chat GPT 😄😄🤣…" — ytr_UgyrNdRmI…
- "For those who are wondering, he is talking about Eric Schmidt and other such peo…" — ytc_UgxFE0msl…
- "Robot don’t go to the bathroom and forget to wash their hands. Robot will also r…" — rdc_j3y8nqt
- "Just another attempt to saddle people with debt. Debt is the big business runnin…" — ytc_UgxuD2cwq…
- "I don’t know why a bigger deal hasn’t been made about the inevitable convergence…" — ytc_Ugw49cVqO…
- "Ok…so get to the point…”What part of A.I. are we supposed to fear?” It actually …" — ytc_UgxI3FWvj…
- "Can't help it, but (not-jailbroken) Max is pretty charming, even when he tells u…" — ytc_UgyWVPnOe…
Comment
If this gonna happen the self driving car will have a safety distance, what you point out is a situation where you cant react on an accident. In german we are told to have half the speed (km/h) in distance.
Anyway you point out a few important things.
My guess: IF the car is programmed well, you can react - means your car can react and by that no accident will happen or just little one.
Source: youtube · Category: AI Harm Incident · Published: 2015-12-09T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgjFBYIvdRYVgHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggfFxjEN8s_5ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugj3QKzIe1Eq-3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghHtM6MCJz6TXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UggQNW11cKIdvngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghwhUYMzBGyVHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgiBxpBHRTAhPHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggPxpiSP8H8OngCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiVRBq_S_B0h3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjz5578tI7sb3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
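The raw response above is a JSON array in which each element carries a comment `id` plus the four coding dimensions shown in the result table. A minimal sketch of how such a response might be parsed and validated before the coded values are stored — the function name and the `ALLOWED` code sets are assumptions inferred only from the values visible on this page, not a documented schema:

```python
import json

# Allowed codes per dimension (ASSUMED: inferred from values seen in the
# coding-result table and raw responses above, not an official codebook).
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self"},
    "emotion": {"approval", "outrage", "indifference", "resignation", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: dimensions}, dropping rows
    that are missing an id or use a code outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        dims = {k: row.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[cid] = dims
    return coded

# Hypothetical single-row response in the same shape as the array above.
raw = ('[{"id":"ytc_X","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(parse_raw_response(raw)["ytc_X"]["responsibility"])  # developer
```

Validating against a fixed code set at parse time means a malformed or hallucinated label is rejected before it can appear in a coding-result table like the one above.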