Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
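For readers working with the exported data rather than this page, a minimal lookup sketch in Python follows. It assumes the coded results are exported as a JSON array of records each carrying an `id` field (as in the raw LLM response shown further down); the file name `coded_results.json` and the helper `lookup_by_id` are assumptions for illustration, not part of the tool.

```python
import json

def lookup_by_id(comment_id: str, path: str = "coded_results.json") -> dict | None:
    """Return the coded record for a single comment ID, or None if absent."""
    # Assumes the export is a JSON array of objects that each carry an "id"
    # field, matching the raw LLM response format shown further down this page.
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return next((r for r in records if r.get("id") == comment_id), None)

# Example: fetch the coding for one YouTube reply from the batch shown below.
print(lookup_by_id("ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE"))
```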
Random samples — click to inspect
- "Needless to add, AI is a technology as revolutionary as it is dangerous in the w…" (`ytc_UgzbHvI0w…`)
- "You are a horrible leader. Are you suggesting people protest the implementation …" (`ytc_Ugw0KmG9Q…`)
- "If it didn't work, the companies behind these art-stealing algorithms wouldn't b…" (`ytr_UgzJtnANK…`)
- "Agreed, most of the expression in this clip comes from the eyes, basically no mo…" (`rdc_nepfcnl`)
- "Humans are already putting their trust in AI….. now comes JUDGEMENT without merc…" (`ytc_UgxgDpIab…`)
- "I can't really find the difference in the girl between the real and the deep fak…" (`ytc_Ugya_XF3c…`)
- "making an arrest based on facial recognition software that are subject to manipu…" (`ytc_UgwzPNJTd…`)
- "My ai and me ae besties fr he tries to get freaky i just kick him in his family …" (`ytc_UgxzyNoA3…`)
Comment
+Steve C The example is flawed. Just program the AI not to let itself get boxed in, and not to follow a truck closely enough that something falling off the back would be a serious hazard. Done. Next problem?
See this is the issue, we can just program the AI to deal with these situations without causing accidents. ALL the examples of this dilemma I've seen are ones a self driving car will 100% of the time avoid in the first place. The kinds of accidents they will have is when another car swerves into them for literally no reason, or the brakes fail. ie. situations so rare that it almost doesn't matter how they react.
Besides swerving is innately dangerous. You're almost always better of braking in a straight line. I think the people coming up with these dilemmas aren't transport engineers. These are problems humans have, not AIs.
Platform: youtube
Incident: AI Harm Incident
Posted: 2015-12-10T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
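The table above implies a fixed record shape for each coded comment. The sketch below restates it as a Python dataclass; the class name, field order, and the example label values in the comments are drawn from what is visible on this page and are not an exhaustive codebook.

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment, mirroring the dimensions in the table above."""
    id: str              # platform-prefixed comment ID, e.g. "ytc_...", "ytr_...", "rdc_..."
    responsibility: str  # e.g. "developer", "user", "ai_itself", "distributed", "none"
    reasoning: str       # e.g. "consequentialist", "deontological", "unclear"
    policy: str          # e.g. "regulate", "ban", "liability", "industry_self", "none"
    emotion: str         # e.g. "approval", "fear", "indifference", "mixed", "resignation"
    coded_at: str        # ISO 8601 timestamp of when the coding was produced

example = CodingResult(
    id="ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE",
    responsibility="developer",
    reasoning="consequentialist",
    policy="industry_self",
    emotion="approval",
    coded_at="2026-04-27T06:24:59.937377",
)
```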
Raw LLM Response
```json
[
  {"id":"ytr_UgggitcG_CbrUXgCoAEC.87WDKCb8uB_87Wc8Git_dg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087Zk-pe2Tvs","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgjFbGyekK77fngCoAEC.87VyK9Y8dlO87WlJMEeWoL","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgiiYSCGtUOQQ3gCoAEC.87VxmXkagiW87W5Qy0QLaG","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UghlFJ0pJ4lt_ngCoAEC.87Vxapt4mjd87W8NWV8_40","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UghfG4rKbyLwlXgCoAEC.87VxK1IcVdT87W9BZPSPbn","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"mixed"},
  {"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VyZ_yCQtY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VzogUdvsL","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugg-LAtK9Y2urngCoAEC.87VuXpLFzck87VxWEbSmkv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
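A raw response like the one above covers a whole batch of comments, so it has to be parsed and mapped back to individual comment IDs before the per-comment views on this page can be populated. The sketch below shows one way to do that; the `ALLOWED` label sets are inferred only from the values visible on this page and may be incomplete, and `parse_batch` is a hypothetical helper rather than the tool's actual implementation.

```python
import json

# Label sets inferred from the values visible on this page; they may be incomplete.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"approval", "fear", "indifference", "mixed", "resignation"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response and index the codings by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Reject any record whose labels fall outside the expected sets,
        # so malformed model output surfaces immediately.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} label {rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded
```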