Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Does AI have a conscious?" Lmfao. Nice grammar. I believe you meant to say " co… (ytc_UgysN3D6A…)
- I think that the creative commons should also add a license about the uses of AI… (ytc_Ugy5bi0Xo…)
- “Or maybe it could kill us all” Yes Ethan. The ai art will jump out of the scree… (ytc_Ugz59jYPS…)
- Small difference between a comms satellite and one performing massive calculatio… (rdc_ohr61z4)
- You ate the most annoying presenter. I could not listen to your whole youtube. O… (ytc_Ugwk-CXA4…)
- AI getting SA and tormented by the group wasnt on my 2025 oney bingo card… (ytc_Ugysd2jJN…)
- Its better to double check the information that the AI gives since sometimes it … (ytr_UgzBFhE1P…)
- they can't even copyright their shit. honestly think it'd be okay to repost thei… (ytc_Ugx5cdxE7…)
Comment
@johnmonsterhunter-z4v It could work, but that would require every company to use a centralised system. I think a solution where each car avoids the accident using just its own data is the realistic implementation here. Since then each subsequent car would avoid the accident using its own sensor data. And if there are some industry standard response strategies, then a mass response to an accident should be predictable despite the fact that each car is working independently. A beacon system similar to TCAS could also be used where cars tell each other which direction they are headed. But I don't think coordinating strategies would work any better than all cars using a predictable response. Not to mention in such a split second response things like packet/connection loss could mean chaos if we rely on a server. This would also consider non-self driving cars since it is a dynamic system.
youtube · AI Harm Incident · 2019-03-06T20:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UghSFLJx8E5vKHgCoAEC.8LkSqu9hQO78MiKcDcdbFT","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UghSFLJx8E5vKHgCoAEC.8LkSqu9hQO78s94GSqxFzL","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UghSFLJx8E5vKHgCoAEC.8LkSqu9hQO78sBTmNBLTnz","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgjBq4Ad_q4r0ngCoAEC.8LbKUHGXzVU8xBqxgJTvlt","responsibility":"none","reasoning":"none","policy":"none","emotion":"resignation"},
{"id":"ytr_UghBZTkNu8IlxHgCoAEC.8L_C9CGmbDM8Nom49TIcsz","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugj96NpyN-f2BXgCoAEC.8LXRTV4jFGF8LvMXG5aqAC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugj96NpyN-f2BXgCoAEC.8LXRTV4jFGF8MMJPY3mPht","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugi_k_2d8FQ3c3gCoAEC.8LOveCSzr648LQMy0NTQqb","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugi_k_2d8FQ3c3gCoAEC.8LOveCSzr648LkFoezgjWx","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugg_qQYiL1e7ZngCoAEC.8KuOXvYh4_g8tF1kV-xXWc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
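The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such output might be parsed and validated before use (the `ALLOWED` sets below are inferred only from the values visible in this sample, not from the tool's actual codebook, and `parse_llm_response` is a hypothetical helper):

```python
import json

# Dimension values observed in the sample response above; the real
# codebook may define additional codes (assumption, not the tool's code).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user"},
    "reasoning": {"none", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "industry_self", "liability"},
    "emotion": {"indifference", "resignation", "approval", "outrage"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip malformed entries entirely
        # Every dimension must be present and hold a known code.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

sample = ('[{"id":"ytr_x","responsibility":"company",'
          '"reasoning":"consequentialist","policy":"industry_self",'
          '"emotion":"indifference"}]')
print(len(parse_llm_response(sample)))  # 1
```

Validating against an explicit code set like this catches the common LLM failure mode of inventing out-of-vocabulary labels, so downstream tallies only ever see codes the scheme defines.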