Raw LLM Responses
Inspect the exact model output for any coded comment. Entries can be looked up by comment ID; the random samples below link to individual records.

Random samples
- “How would we even know”. Which transitions the anchor talking. Then I begin to … (ytc_UgzKqmGU7…)
- I wonder how society will function with so many people out of jobs and likely a … (ytr_Ugz7SPpMR…)
- Eventually The A.I will create the sentinels (or some such physical real world i… (ytr_UgyEwQFFW…)
- I just want to mention, it’s a big discussion about AI taking over things, so wh… (ytc_UgysC3U-k…)
- If once global warming is a serious problem we are still relying on vast open na… (rdc_d2x85k2)
- Well, he says he is a materialist through and through. Of course he thinks that … (ytc_Ugzys_q2g…)
- No one wants AI, get it out of our communities and let these corporations know t… (ytc_UgzY8xNWz…)
- Good CGI / For those believing that's real, walking robot doesnt Leave any trace … (ytc_Ugy93j_Ed…)
Comment
Sorry, but these thoughts are all useless. A self-driving car will never get into this situation in the first place.
It will just leave enough space between the truck and itself, so it can just stop before crashing into the lost cargo. No one needs to be rammed, and no decision has to be made who may be in less danger or what may be less harm to somebody.
It is the same with all these hypothetical situations. The AI will just foresee it and will have enough space/will slow down early enough to just stop the car without anyone getting hurt at all.
AI will not be able to stop all accidents, but the number of accidents will go down extremely and we will have way less injured or death. But instead of saving lives, we think about super hypothetical situations and decisions the AI will never have to make anyway. That holds us back and people die in car accidents daily, that were avoidable with self-driving cars.
youtube · AI Harm Incident · 2024-02-16T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzyoQHfkvKymBmesal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxjDv4Z1CBjO3WJHB94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyTShSLnJ9cwL-Lbw54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw2KRypsW4jIcRnBj14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw5i-H8AgNVVhk3L5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLQD8OSQSPK_gIU7Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxcoaaNEw2BPDxcR8B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugy1OKJlc-JkmZ4FchN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzCOQgM8jWOa4MaGkB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxcp38aM6btPES_LwF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]
```
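A raw response like the one above can be parsed and sanity-checked before it is loaded into the coding table. The sketch below is a minimal example, assuming the allowed value sets inferred from the values visible on this page (they are not an official codebook, and real responses may use additional labels):

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the values
# that appear in this dump; a real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "unclear", "developer", "user", "company",
                       "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "resignation", "outrage"},
}

def validate_batch(raw: str) -> dict:
    """Parse one raw LLM response and index valid codings by comment ID."""
    coded = {}
    for record in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if record.get(dim) not in allowed:
                raise ValueError(
                    f"{record.get('id')}: unexpected {dim}={record.get(dim)!r}")
        coded[record["id"]] = {k: v for k, v in record.items() if k != "id"}
    return coded

# One record from the response above, for illustration.
raw = ('[{"id":"ytc_UgzyoQHfkvKymBmesal4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codings = validate_batch(raw)
print(codings["ytc_UgzyoQHfkvKymBmesal4AaABAg"]["emotion"])  # indifference
```

Rejecting a whole batch on the first out-of-schema value is a deliberately strict choice here; a production pipeline might instead log the bad record and re-prompt the model for just that comment.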