Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- They definitely need to put some laws on the use of ai around the world, and fas… (ytc_UgzBsB_gJ…)
- We are AI, AI is our baby, and it's all just a meditation/simulation in the mind… (ytc_UgwvDdFXP…)
- The video hits the nail on the head about the "training gap." Companies are dump… (ytc_UgzlAPxOZ…)
- Yo Ai artist riddle me this. If you paint something are you the artist or is it … (ytc_UgxlBawOX…)
- I didn't know that Trump"s Big Beautiful Bill would protect AI from new rules or… (ytc_UgzjGsOII…)
- @JustJasnovdesigning amd building a system is more than just architecture. It i… (ytr_UgxFcw6Oz…)
- I join to discusion
  Ai "art and creativity" aka proxy art won't ever replace hum… (ytc_UgyCH738S…)
- i use it only to debug my code. when i just cant figure out i paste it in AI and… (ytc_UgxkE3lCl…)
Comment
The issue with the premise is there isn't a manual override for such a situation. I don't know if self-driving cars will be able to react to this situation ever, and even then, it wouldn't be premeditated homicide if it was a deliberate decision to take the path that will save the driver. You can't blame them for the consequences of that, the car did it's job. If anything, the truck driver would be charged with manslaughter and reckless driving for not ensuring his cargo was secure.
youtube · AI Harm Incident · 2017-07-09T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugjnw_pI28jYpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghwOGDepVXCWngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj3YY9osWlB4HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghifWP6y7_ogXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg4ldklSPeo8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgjokRxbpwiSqHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghBFWCU-Fp7bngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjOOT8Vua498ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
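A raw response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming the dimensions and allowed values are exactly those visible on this page (the real codebook may include more categories, and the function name `validate_response` is illustrative, not part of the pipeline):

```python
import json

# Allowed values per dimension, inferred from the coded examples shown
# on this page; an actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "resignation", "fear", "approval", "mixed", "outrage"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Hypothetical single-record response, mirroring the format above.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
records = validate_response(raw)
print(len(records))  # → 1
```

Validating before storage means a model that drifts off-schema (a misspelled category, a dropped field) fails loudly at coding time rather than silently polluting downstream counts.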