Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "NOT JUST DOORBELL, I just looked in my app, and its all of their cameras! Yes, t…" (ytc_Ugz-lCwHu…)
- "Sam Altman says it will only shock people for two weeks then it will be all ok l…" (ytc_UgwUU2WHJ…)
- "@farhanahafiz8144hes kinda right. If the truth is that A.I is on the rise. Whic…" (ytr_Ugxrl7IhN…)
- "I think it can be useful for concept work to get one's ideas immediately out of …" (ytc_Ugy867qR9…)
- "I hate how these "AI defenders" don't give a shit about the AI that WILL be the …" (ytc_UgxLwcKTF…)
- "It doesn't matter WHO gets there first... Man is becoming hopelessly wicked and …" (ytc_UgyOECco-…)
- "Ai is good, but I think they should still not be able to fire ai with humans bac…" (ytc_Ugyq8w9uK…)
- ""AI" (actually just LLMs but i dont think ai image defenders are going to like m…" (ytc_UgzxeVMLm…)
Comment
Why isn't the self-driving car coded to keep safe distance to avoid collision? Self-driving car would be calculating safe distance by detecting the speed of itself, the weight of passengers etc. variables, which affects the distance needed to stop the car. If safe distance is needed, no collision will happen, since falling objects from car in front will keep moving forward because of laws of physics. Those objects WILL lose speed, but not as fast as your self-driving car whilst braking, so safe stopping is possible.
Yes, car behind you could crash you, but if it has also safe distance, it will stop in time. Safe distance should be kept while in manual mode too, but car driving it self could keep it way better than any human driver.
youtube · AI Harm Incident · 2015-12-10T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgiJaQs6F28eWHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggJ82QW9q6Yh3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghnlhSnEQZ0IngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggjXP4s7034gngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgjMw5uEv4uP13gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggU7UUEmbYyYHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjNuOWAcDkP3HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgisvA4COAatfngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjTfq8djgy0rHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghuJ8ET5_X-j3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
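The raw response above is a JSON array of one code object per comment, keyed by comment ID. A minimal sketch (assuming the model output parses as valid JSON in exactly this shape) of how such a response could be indexed for the per-comment look-up this view provides, using two rows from the array above:

```python
import json

# Two rows copied from the raw LLM response shown above.
raw_response = """[
  {"id":"ytc_UgjMw5uEv4uP13gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggU7UUEmbYyYHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"}
]"""

def codes_by_id(raw: str) -> dict:
    """Parse the model output and index each code object by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = codes_by_id(raw_response)
print(codes["ytc_UgjMw5uEv4uP13gCoAEC"]["responsibility"])  # -> developer
```

The function name `codes_by_id` is illustrative, not part of the tool; the point is only that a dict keyed on `id` gives constant-time retrieval of the coded dimensions for any comment.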