Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "3:19 This scenario is really a non-issue. For one reason: I may be stupid enoug…" (ytc_UgwXFKYfp…)
- "AI 😂😂😂 no, the lazy and those with no unique creative skill are gone like cattle…" (ytc_Ugxx6UO89…)
- "Uh, you haven’t heard of the self-driving cars that were all stuck in an interse…" (ytc_Ugw7aXqDM…)
- "The one thing i realized is once AI is self aware it can correct any human lies …" (ytc_Ugxwbjkzk…)
- "I have bad news, my vocab adapts to chatGPTs. I never copy chatGPT but I keep so…" (ytc_Ugyjz5FFO…)
- "Don't introduce robot in Africa, you want to kill us with hunger, see the dark s…" (ytc_UgzxcPibR…)
- "though AI is advancing, still it cannot conquer death nor can it reproduce anoth…" (ytc_UgxXxX9ne…)
- "I think people are getting way too hung up on AI consciousness. An AI doesn't n…" (ytc_Ugx7ENd8Z…)
Comment
Don't swerve. A car is designed to handle acceleration along its path of travel, such as braking, and the crumple zone is best suited to a head-on collision, not a side-on one. The design problem should focus first on avoiding the situation altogether (a larger following distance), and second on what to do once it arises. In this particular case, a head-on collision is safer for the other motorist AND more survivable for the passengers in the driverless car. Swerving kills, both the occupants and those around them. Here, the answer is easy.
But I get what the video is trying to say.
youtube
AI Harm Incident
2015-12-09T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjFBYIvdRYVgHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggfFxjEN8s_5ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugj3QKzIe1Eq-3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UghHtM6MCJz6TXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UggQNW11cKIdvngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghwhUYMzBGyVHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgiBxpBHRTAhPHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggPxpiSP8H8OngCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiVRBq_S_B0h3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugjz5578tI7sb3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
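The raw response is a JSON array with one record per coded comment, each carrying the four dimensions shown in the table above. A minimal sketch of how such a batch might be parsed and sanity-checked is below; the allowed values are only those observed in this output, not necessarily the tool's full codebook, and the function name is hypothetical.

```python
import json
from collections import Counter

# Code values observed in the batch above -- an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "mixed"},
}

def validate_batch(raw: str) -> Counter:
    """Parse a raw LLM coding response and tally codes per dimension.

    Raises ValueError on a malformed comment id or an out-of-vocabulary code.
    """
    records = json.loads(raw)
    counts = Counter()
    for rec in records:
        # Comment ids in this tool are prefixed "ytc_" (YouTube comment).
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={value!r} not in codebook")
            counts[(dim, value)] += 1
    return counts
```

Run against the array above, `counts[("responsibility", "developer")]` would tally the four developer-attributed comments; a record with a misspelled code or a missing dimension fails loudly instead of silently skewing the counts.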