Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytr_UgzImw__-… — "I don't think this argument will hold. The best Ai work being done at the moment…"
- ytc_UgwJ1emBt… — "clearly a successful rage bait. Generic out of the box free AI slop, no way he a…"
- ytc_UgywY8dCA… — "I think your comments are very valid, unfortunately I think the future is fraugh…"
- ytr_UgzPvg2b4… — "@vespertellino can you explain the distinction precisely? near as i can tel…"
- ytc_UgzUabgVv… — "Does it really matter they can't program AI any worse than they have the teacher…"
- ytc_Ugwzm1wch… — "Here's an interesting theological take: the fear people have for AI is the same …"
- rdc_i2u1bu7 — "Remember the episode of the office where Michael follows the gps directions into…"
- ytc_UgwTwLo-d… — "Thank you senator. Powerful words and brave of you to share this. I think the be…"
Comment
In my opinion it is clearly the fault of the driver. The bike and the woman can't be seen in the dark, but ceep in mind that cameras often see less than the human eye at night (Same street with another camera: https://youtu.be/CRW0q8i3u6E ). And even IF the sight were as bad as in the video, you are not allowed to drive faster than the point "stopping distance (with reaction time) = distance of sight". This applys also to automatic cars, because otherwise their driver can't react if there is an failure. Which was here the case. Driver distracted, Lidar and sensor failure (car didn't brake at all, Lidar systems usually working better at night). Uber develops their cars on the street.... Other companys are using slow cars (Google) or test courses not without reason.
Platform: youtube
Incident: AI Harm Incident
Posted: 2018-03-23T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzCQ44Cg1Md9zU1EhF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzpsJ__r0N7gbEHzNp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwlQI3p3kX7MPYT5Y54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx827I7qz11nS6-KKN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyQL2iyLyyfUKTYHuV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwOp0N0eJQbbJGK7FV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxDiYRj-r_H2-7ZEDh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyr7nzZGq9xk_Jlyr94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwoCxiJBTSmYhvxVNh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPiLcv4IIOYGr7E4B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"})
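The all-"unclear" coding result above, despite the specific codes present in the raw response, is consistent with a lookup failure: the raw response terminates with `)` rather than `]`, so strict JSON parsing would reject it. A minimal sketch of such a lookup with an "unclear" fallback follows; the function name, the dimension schema (taken from the table above), and the fallback behavior are assumptions for illustration, not the pipeline's actual code:

```python
import json

# Coding dimensions, as listed in the result table above (assumed schema).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_for(comment_id: str, raw_response: str) -> dict:
    """Look up one comment's codes in a raw batch response.

    Falls back to "unclear" on every dimension when the response is not
    valid JSON or the comment ID is absent -- a plausible (assumed)
    explanation for the all-"unclear" result shown above.
    """
    fallback = {d: "unclear" for d in DIMENSIONS}
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        return fallback
    for rec in records:
        if rec.get("id") == comment_id:
            return {d: rec.get(d, "unclear") for d in DIMENSIONS}
    return fallback

# Hypothetical IDs for demonstration only.
raw = ('[{"id":"ytc_X","responsibility":"company","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
print(code_for("ytc_X", raw))  # matched record
print(code_for("ytc_Y", raw))  # missing ID: all dimensions "unclear"
```

Under this reading, a truncated or malformed model output silently degrades to "unclear" rather than crashing the run, which matches how the result table and the raw response can disagree.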