Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:

- `ytc_UgzubKQ4b…`: "Yes you can… the algorithm goes down to the character I’ll give you that but if …"
- `ytc_UgwoEeGgN…`: "The reason I said AI was because the trap muscles is too big on that close shoul…"
- `ytc_Ugwggmv4w…`: "Sadly, this is the reality of art. There's ALWAYS going to be thieves, but AI ha…"
- `ytr_Ugy2gU0N5…`: "This is partially true! We actually have far more technology in the Ai space oth…"
- `ytc_Ugir1uoAg…`: "This is a wrong exemple. This situation should not exist. The self driving car s…"
- `ytc_Ugx_6MJjk…`: "as someone who is well versed in the tech world, I can understand how it could b…"
- `rdc_jf8hj08`: "My entire skillset is in the creative space, so I'm directly under the crosshair…"
- `rdc_dl17o6y`: "Your own nuclear arsenal isn't a deterrent for them because they only give a fuc…"
Comment
> This entire video is asking the wrong question. The question shouldn’t be who should you hit, it should be what’s the best way to avoid hitting anyone.
> For example at 0:54 just avoid everyone. There are obvious gaps in between each vehicle around the car, and self driving cars are very smart, so it would just try its best not to hit anyone. In this case, that would probably mean going towards the motorcycle side, since there’s more space there.
> 2:14 I see that there is a large gap on the right. The car would simply turn right while slowing down to change lanes and end up behind the motorcyclist.
> You’re making this into a very large issue when it’s really not much of a problem. I get that you want to stay with the headlines from the news and stuff to get views, but this could ruin the self-driving car business
youtube · AI Harm Incident · 2019-02-01T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyReg2RJcQbRU8fXqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwg9zPgDxoVHbvC0MR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyNM-AKHsF-2MXWWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6kL7XHAZLi2NywJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgRIau2zSrD54ZIb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgZBpAS47AyZs-L4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgfBkZSlB2KJ9236h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgpFP6xoDLiHG7IYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
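A batch response in this shape can be parsed and validated before its rows are stored as coding results. Below is a minimal sketch: the four dimension names and their values are taken from the table and JSON above, but the allowed-value sets are only those observed here (the full codebook may include more categories), and the function and variable names are illustrative, not from the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the coded output shown above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}, rejecting malformed rows."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value for {dim}: {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(parse_batch(raw)["ytc_example"]["emotion"])  # approval
```

Validating against a closed vocabulary at ingest time catches the common failure mode where the model invents an off-codebook label, so bad rows surface immediately instead of silently skewing the tallies.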