Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "that's called false dichotomy. When you hear, that in both outcomes you lose, so…" (ytc_UgzQPiKbv…)
- "Many physicians will be replaced. Radiologists may be the first. Internists who …" (ytc_Ugz6U3KnD…)
- "The only work that will never change is creativity careers. Yes A.I. could crate…" (ytc_UgwXdsUkV…)
- "The prediction is that everybody either gets everything distributed equally by t…" (ytr_UgyhUx98E…)
- "Why not make it so AI generated art must stay publicly owned, and cannot be copy…" (ytc_Ugy2mCudx…)
- "I mean, sometimes people agree on things. Progressives hate it because it’s deri…" (ytr_Ugwibq8X3…)
- "This is exactly my experience. I have GPT, Gemini, and Claude sessions (which I …" (ytc_UgyNa01bH…)
- "I can't believe ai has evolved into making images, videos, and even songs. The c…" (ytc_Ugw4NVyUA…)
Comment
This completely ignores the implementation of machine learning into self-driving cars. It is more likely that the specific reasons that the car swerved one way or another cannot be known - simply the response of the trained neural network -- and so no specific moral decision was made, either at the time of the accident or before. This is more inline with the "reaction," i.e., lack of conscious motivation, that we assign to the human.
youtube · AI Harm Incident · 2019-05-13T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyReg2RJcQbRU8fXqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwg9zPgDxoVHbvC0MR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyNM-AKHsF-2MXWWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6kL7XHAZLi2NywJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgRIau2zSrD54ZIb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgZBpAS47AyZs-L4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgfBkZSlB2KJ9236h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgpFP6xoDLiHG7IYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
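The lookup-by-ID view above can be reproduced programmatically: the raw model output is a JSON array of per-comment codings, so retrieving one comment's coding is a parse plus a linear scan. A minimal sketch, assuming the batch format shown above (the `ytc_EXAMPLE…` IDs and the `lookup_coding` helper are hypothetical, not part of the tool):

```python
import json

# A raw LLM batch response in the format shown above: a JSON array of
# coded comments. The IDs here are hypothetical placeholders; real IDs
# look like "ytc_UgyReg2RJcQbRU8fXqh4AaABAg".
raw_response = """
[
  {"id": "ytc_EXAMPLE1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_EXAMPLE2", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID,
    or None if the JSON is malformed or the ID is absent."""
    try:
        batch = json.loads(raw)
    except json.JSONDecodeError:
        return None  # model emitted malformed JSON; caller should re-prompt or skip
    return next((row for row in batch if row.get("id") == comment_id), None)

coding = lookup_coding(raw_response, "ytc_EXAMPLE1")
print(coding["responsibility"])  # → ai_itself
```

Returning `None` on malformed JSON (rather than raising) reflects a common design choice when coding at scale: a failed batch is logged and retried instead of aborting the run.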