Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Here are my thoughts on the spot:
1. Don't most self-driving cars teach themselves how to drive better? So it may be no one's fault.
2. Isn't the whole idea to have basically every car be self-driving and communicate with each other? In that case, your car tells the car behind you that you are going to brake hard so that car in turn brakes (although this could be a problem while self-driving cars aren't widespread).
3. Drivers are still supposed to be ready to take control of the car at any moment, so I would expect the car to brake and the driver to grab the wheel and swerve. However, I believe it should be the car manufacturer's responsibility to inform drivers about situations like this so that they can actively avoid getting boxed in or ending up in any other dangerous situation.
4. Self-driving cars could be programmed to recognize situations like this and issue a warning to the driver to change lanes or something before disaster strikes.
5. If it really comes down to it, the most ethical approach might be to have the car recognize that, whatever action it takes, someone is getting hurt, and have RNG decide between swerving to either side or braking (since I doubt any human would drive straight into the cargo unless they didn't have time to react).
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2015-12-10T06:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgiJaQs6F28eWHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggJ82QW9q6Yh3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghnlhSnEQZ0IngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggjXP4s7034gngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgjMw5uEv4uP13gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggU7UUEmbYyYHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjNuOWAcDkP3HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgisvA4COAatfngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjTfq8djgy0rHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghuJ8ET5_X-j3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```