Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Agreed 100%. We are in an AI bubble, and 90% of all AI startups, tools, and appl…
ytc_UgwjRvmGy…
the thing that pisses me off the most about a few ai defenders is that some say …
ytc_UgyRtTaIk…
And researchers are forced to take whatever funding they can get to keep their l…
rdc_emuo6nc
Honestly, I think ai art should be used for inspiration.I was to have a hard tim…
ytc_UgxQlzTyK…
NEAR END: ASK THE AI FOR IT'S CONSENT??? WTF? Are you high. He sounded intellige…
ytc_UgzylkhZE…
It has been proven that face recognition tech has trouble with POC faces and yet…
ytc_UgxHWxIOM…
It's super early. Wait five years when AI exponentially improves, companies beco…
rdc_kyzat7q
6 months later david shapiro has an interesting chat with Claude 3 about this ex…
ytc_UgwQoaQJI…
Comment
In real world this situation would never happen. Let say the only self-driving car on the road is behind and truck, without factoring what's on the left and the right it should have kept a following distance of 6seconds. Now to prove my point I need to do a little math here. 1 miles is equal to 5280 feet, at 60MPH every second the car would be traveling at 88 feet per second. and the following distance should be 528 feet. Let say the obstacles drop from the truck and move toward the approaching car by 50 feet and the car took 4 whole seconds to register it and take action. At this point, the car has 2 seconds which equate to 176 miles minus the 50 feet that the obstacles had moved mean the car has 126 feet to brake from 60-0, I'm not sure if you know 126 feet from 60-0 is actually super easy to archive with most cars these days. Also, factor in the fact that those Self-driving cars often comes with superior parts would mean this should be no problem, also 4 seconds delay is super long. I doubt the car would require that much times, however even if it did it would still brake in time.
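The commenter's arithmetic can be sanity-checked with a short script. This is only a sketch using the commenter's own assumptions (6-second following distance, 4-second reaction delay, 50 ft of obstacle travel), not verified vehicle specs; note the comment's "176 miles" reads as a slip for 176 feet, since 2 s at 88 ft/s is 176 ft.

```python
# Sanity check of the commenter's figures, all in feet and seconds.
# Every input below is the commenter's assumption, not a measured value.

MPH_TO_FPS = 5280 / 3600           # 1 mile = 5280 ft, 1 hour = 3600 s

speed_fps = 60 * MPH_TO_FPS        # 60 mph -> 88 ft/s
following_ft = 6 * speed_fps       # 6 s following distance -> 528 ft gap

reaction_ft = 4 * speed_fps        # distance covered during 4 s delay -> 352 ft
obstacle_ft = 50                   # obstacle moves 50 ft toward the car

remaining_ft = following_ft - reaction_ft - obstacle_ft
print(remaining_ft)                # 126.0 ft left to brake from 60 mph
```

The 126 ft result matches the comment's claim, which is in the ballpark of typical 60-to-0 braking distances for modern cars, so the arithmetic itself is internally consistent even if the 6-second gap is an optimistic premise.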
youtube
AI Harm Incident
2016-11-21T06:0…
♥ 260
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgjGy_ree2B0EHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj96NpyN-f2BXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghqMvbGky59jHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi_k_2d8FQ3c3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugg_qQYiL1e7ZngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggUGDnRAEQYy3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggfRtqOpBkxgHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggNnXWdPpcRW3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjqog_GKULDRHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghYlkS6IWtLL3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]