Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_Ugztpn6wB…`: No need to shout...anyway we will all be safer when self driving is more reliabl…
- `ytc_Ugw1G0c-0…`: ChatGPT: Just bring your hands together and clap / Alex: No, I can’t. There’s inf…
- `ytr_Ugwuzy32e…`: @thewannabecritic7490 The slop comes from low effort on the human side. It wasn…
- `ytr_Ugz402FQ8…`: I don't know one way or the other, but that modern Luddite movement may prove to…
- `ytr_UgyAlRost…`: @syzygy4669 yes, i know cursive and I know how to use a computre, AI will be a p…
- `ytc_Ugxl2FyK4…`: Are the scientists trying to boost up the capacity of a human's brain through th…
- `ytc_UgyisA6yf…`: by the time you become a qualified plumber -4 years in this country , there will…
- `ytc_UgxRWetlv…`: Something i realized recently with traditional vs digital is that / What you learn…
Comment
My thinking on this is that if the same situation was given with 1 alteration I see no REAL issue here. The one small change is that if ALL cars are smart/self driving then each car on the road would see the immediate danger and avoid it, while the others do the same. So when the first car must avoid the falling object the other car would avoid the now swerving one. A domino effect of sorts, now I can also see this leading to multi car pile ups, however that is to say any incident currently can lead to the same with smaller margins of error, a human driver can be distracted, the machine can't leading to over all fewer collisions.
youtube · AI Harm Incident · 2015-12-09T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg0f6gYDoM2u3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg8lvN9vbzqmngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiOKnl1PCwC5XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghcyXISlo02pngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi2qolU0hnvF3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghXbi7fYusSUngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugj2jxTP45dbIngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiMqqVdmeG89HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj-Xh3Fxwz1RXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugh3GHPd7ug6D3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
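A raw batch response like the one above can be validated before the codings are stored. The sketch below parses the JSON and keeps only rows whose values fall inside the per-dimension vocabularies; note that `ALLOWED` is an assumption inferred from the values visible in this response, and the real codebook may define more categories.

```python
import json

# Per-dimension vocabularies inferred from the visible response above.
# ASSUMPTION: the actual codebook may allow additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # A usable row is a dict with an id and a known value per dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
print(len(validate_batch(raw)))  # → 1
```

Rows with an out-of-vocabulary value (a common LLM failure mode in structured coding tasks) are dropped rather than coerced, so downstream counts only ever see codes the schema defines.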