Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Something else to consider: I am against any self-driving vehicle but especial… (ytc_Ugy5qQ40D…)
- The ban on AI development will only apply to the public. That in no way means th… (ytc_Ugx-xu8Do…)
- You press a button, framing and other things are also true of AI art, you still … (ytr_UgxDib2Nv…)
- "humans are so destructive" -AI program that decided to kill 8 billion people t… (ytc_UgzPLdUjG…)
- Irrational exuberance? What human problem does AI really solve? We are at the 17… (ytc_UgxEAaNIZ…)
- Dude I spent like a month working 65 hours a week, and it damn near broke me. Ho… (rdc_dv0i89e)
- This whole bit where getting stuck on the word 'excited' is a bit pedantic. You … (ytc_UgzGnluPp…)
- @JessyeOnline yeah they are, maybe not the small ones, but the ones that will e… (ytr_UgyNF9UTp…)
Comment
The person in the car should be charged with vehicle killer homicide. This pedestrian is clearly visible well within the range of reaction time. But the person is not paying any attention which is just criminal. Fact is Uber is experimenting with the lives of innocent people on public roads. This car is not programmed to avoid the collision which is what 99.99 % People would do when confronted with this situation. You can clearly just drive around the individual involved if you can’t stop. This is called accident avoidance. I’m sure everybody that’s looking at this video has done it but this car is not capable when it is in autonomous mode. This is just an example of what will be many deaths from self driving vehicles. Uber an individual that was in the vehicle needs to be held accountable for the death of this person.
Source: youtube
Category: AI Harm Incident
Posted: 2018-03-23T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzxHy7uvJZhpcmxlll4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyRLMUoUAjClat7CG94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_UgyL8WBj8DHdQk_Igr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx_fxnOM5I7wYFz85l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxlunewTn_ITmIHzPN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzfw1Jw6CnMajwDjZB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgweCGqcYlfeYk4zsgZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgypaD3yhSCq87jelDp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwbg15AM1VjNdzADvx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy59dP3dxKmQBJahuF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
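The lookup-by-ID view above can be approximated with a short script. This is a minimal sketch, assuming the raw response is a well-formed JSON array whose records carry the four coding dimensions shown in the table; the function and variable names are illustrative, not part of the tool, and the sample data is truncated to two records for brevity.

```python
import json

# Raw model output in the format shown above (truncated to two records).
raw_response = """
[
  {"id": "ytc_UgzxHy7uvJZhpcmxlll4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwbg15AM1VjNdzADvx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the JSON array and index records by comment ID,
    skipping any record that is missing one of the four dimensions."""
    records = json.loads(raw)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codings = index_codings(raw_response)
print(codings["ytc_UgzxHy7uvJZhpcmxlll4AaABAg"]["emotion"])  # outrage
```

Indexing by ID up front makes each subsequent lookup a constant-time dictionary access, which matches the "Look up by comment ID" workflow better than rescanning the array per query.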