Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I don't see this as an actual issue. I feel like the current rules of the road can work well with this, and this problem would only become less relevant as more and more cars become self-driving ones. The rules of the road say you need to keep a certain distance from cars, and since the falling object didn't fall directly onto the self-driving car, it can be assumed that it's far enough away. Why wouldn't the car slam the brakes for you, and then the ones behind you simply slam their brakes too? (Assuming that they're also keeping enough distance to be able to react.) Later, this problem kind of solves itself once we introduce more of these cars, because they would communicate with each other, and that object would not even be noticed by the passengers or other cars.
The only flaw I see in my logic is human reaction time for those in human-operated cars. Maybe I'm missing something, though.
youtube · AI Harm Incident · 2015-12-09T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugi5b5pbaFA4-HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ughno_FgymJ6c3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Uggj7mDHrma5v3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggNqFSJ4vIgCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggrI2xcSyTYu3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiaKBAMwPkUFngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj26mPeq39upXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghpeMaaKGI5aHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggWaE2jszLulngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghT2utfCq_hLXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
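A raw response in the shape above can be consumed programmatically to recover the coding for a single comment. Below is a minimal sketch in Python; the `coding_for` helper and the two-entry sample payload are illustrative assumptions, not part of any real tool, though the IDs and values are copied from the response above.

```python
import json

# Hypothetical sample payload in the same shape as the raw response above:
# a JSON array of per-comment codings.
raw_response = """[
  {"id": "ytc_Ugi5b5pbaFA4-HgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghpeMaaKGI5aHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

def coding_for(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding dict for one comment ID,
    or None if that ID does not appear in the response."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None

# The entry matching the inspected comment carries the values shown
# in the Coding Result table (emotion: indifference).
coding = coding_for(raw_response, "ytc_UghpeMaaKGI5aHgCoAEC")
print(coding["emotion"])  # indifference
```

Looking entries up by ID rather than by position keeps the lookup robust if the model returns the codings in a different order than the comments were supplied.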