Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Americans need to pay attention! We have a lot of AI taking place here already.…" — ytc_UgzGsfeD5…
- "In short: Writer and Artist decided to make a copyright infringing graphic nove…" — ytc_Ugx9zc0qi…
- "I don't like people... except hot girls lol, so I'm all for AI. We just need bas…" — ytc_Ugy3aV28r…
- "Moving people out of poverty creating a goodwill and also a future market for am…" — rdc_dcwbhz5
- "@KunyareBlogger you'd be surprised of how much of a bitch it is to actually get …" — ytr_UgwpZKniZ…
- "Remember the movie on Disney where the robot got mad at the owners and came to …" — ytc_Ugxu2CECC…
- "Can AI offer a scenario to enable Renewable Green Energy economically viable? T…" — ytc_UgxEo584F…
- "To be fair, I think ChatGPT only added the heads up about it not necessarily bei…" — ytc_UgwgvzfXs…
Comment
The first question I always ask when it comes to ethical dilemmas: Can we avoid the situation in the first place through beneficial and reasonable means? In this case, it is absolutely trivial for the car driving behind the truck to be driving far enough back to react and brake in time. In fact, that sort of thing is already being done by automated vehicle systems because scientists had some common sense when designing them.
This particular scenario can still arise when the systems fail, in which case there should be backups. And if the backups fail then the least bad choice should be to simply brake and avoid hitting any other vehicle since the risk should be on the people in the vehicle, not the others around them.
youtube · AI Harm Incident · 2015-12-09T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgjFBYIvdRYVgHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggfFxjEN8s_5ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugj3QKzIe1Eq-3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghHtM6MCJz6TXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UggQNW11cKIdvngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghwhUYMzBGyVHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgiBxpBHRTAhPHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggPxpiSP8H8OngCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiVRBq_S_B0h3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjz5578tI7sb3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
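The raw response above is a JSON array with one object per coded comment, keyed by the same four dimensions shown in the coding-result table. A minimal sketch of how such an output could be parsed and indexed by comment ID (the function and variable names here are illustrative, not part of the tool itself; the two sample rows are taken from the response above):

```python
import json

# Two rows copied from the raw LLM response above, for illustration.
raw = '''[
  {"id": "ytc_UgjFBYIvdRYVgHgCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugj3QKzIe1Eq-3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]'''

# The four coding dimensions every row is expected to carry.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a raw coding response and index rows by comment ID,
    skipping any row that is missing an expected dimension."""
    rows = json.loads(raw_json)
    return {
        row["id"]: {k: row[k] for k in DIMENSIONS}
        for row in rows
        if DIMENSIONS <= row.keys()  # all four keys must be present
    }

codings = index_codings(raw)
print(codings["ytc_Ugj3QKzIe1Eq-3gCoAEC"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" workflow cheap: each lookup is a single dictionary access rather than a scan over the array.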