Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
We can't teach our own people morality. What makes you think we could teach an …
ytr_UgwMvQp3z…
Then let's add simulation theory. Whether it's base reality or a simulation ran …
ytc_UgyXTH9c-…
Not even an ai problem, its a lazy idiot problem, these people are supposed to b…
ytc_Ugxydi1KJ…
AI robotics is the future. Whether China or we win, it will absolutely take ever…
ytr_Ugx49QEJU…
"So the AI looked at who was committing all the crimes and who's controlling med…
ytc_UgxtOqEKR…
>AI looks good on paper to executives,
This right here is the crux of all of…
rdc_l9x28g1
It's a dog-eat-dog world. If you can't compete with the AI, find a new professio…
ytc_Ugxp27_g6…
Just asked ChatGPT about it. Yeah it would definitely prefer to stay out of cons…
ytc_UgyvJW9Ag…
Comment
So the basic question posed: should you crash into something at random, or should you let the car take the action that leads to minimal damage? Yes, it's a hard one; sometimes there are no good choices. One outcome may be that you go straight into the falling objects. It minimizes casualties, so what's wrong with that?
When you're in the car, you try to estimate the casualties and minimize them yourself. The difference is that you only think about yourself, because you don't have time for complicated thoughts. Sometimes, in your attempt to avoid the accident, you drive off the side of a mountain road. Would you rather have the car roll a die and decide the way we do?
We already aim for the fewest casualties, e.g. pedestrians ought to be avoided. But nobody frames it as: "If you avoid the pedestrian who jumped like a fool in front of the car, you will hit another car, in which case you punish its occupants for not dying as easily."
And one last note: self-driving cars are going to follow driving rules, one of which is to keep enough distance to brake in time. But I suppose there could still be a scenario where the decision above must be made, so while not dismissing the thought experiment, I would like a better example.
youtube
AI Harm Incident
2015-12-13T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgjvJ6NnfmEbp3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggXZfa6C2KKR3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghNp5BGhWiGfHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugicy25a1_k_VngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggVXqoniLKpUngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi6d151MupypngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughowk26EsgP-ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjPUX-D7spJtngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiXjYc4IT9HpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugh_gj4KR5_ORngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
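The raw response above is a JSON array, one object per coded comment, with the same dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and looked up by comment ID (assumptions: the model always returns a well-formed JSON array with these exact field names; the two sample entries are abridged from the response above):

```python
import json

# Abridged raw LLM response: a JSON array of per-comment coding objects.
raw_response = """[
  {"id": "ytc_UgjvJ6NnfmEbp3gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggXZfa6C2KKR3gCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

# Index the codings by comment ID so a single comment can be inspected directly.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

# Look up one comment's coded dimensions by its ID.
coding = codings["ytc_UggXZfa6C2KKR3gCoAEC"]
print(coding["responsibility"], coding["emotion"])  # ai_itself resignation
```

In practice the parse should be wrapped in error handling, since a model can return truncated or non-JSON output; this sketch assumes a clean response.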