Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I've stated from the beginning that AI will be as wicked and deceitful as the hu… (ytc_Ugw5TCS-l…)
- Because for some reason everyone thinks modern AI is actually omnipotent artific… (rdc_n0pu7tt)
- After 10 minutes of conversation, Robot: "give me your clothes, boots, and your… (ytc_UgxQwNGon…)
- So even with more whistleblowers stepping forward, google and even likely its in… (ytc_UgwgK5qZI…)
- I play wind instruments and I get downright offended when I get that, I just pla… (ytr_UgyUQFZtf…)
- Thank you for sharing your thought on the video! The interaction between the pre… (ytr_Ugzn0x4YP…)
- Reality is not that simple. LLM do not work reliably alone. They hallucinate, ma… (rdc_mxy6dr8)
- Would be funny if the AI from Customer side says there is nothing wrong with the… (ytc_UgyIoFE1K…)
Comment
We have also kind of programmed ourselves what to do in situations like this. We know that we should avoid collisions with objects and cars, but given the circumstances we decide how to act. Perhaps if I see someone on the right side of the SUV, I would prefer getting hit, but if the driver is far from the edge of the car I would rather hit the car. One can program a computer not to make specific decisions, but to consider the circumstances at the moment.
Edit: I wrote this without watching the entire video, but the motorcycle problem will not be a problem for a car. A robot wouldn't think ethically about "punishing", but would try to maximize the chances of survival for all sides, so choosing the helmet-wearing motorcyclist would be the "correct" choice, or perhaps the computer would prefer getting hit because its chances of survival are way better.
youtube · AI Harm Incident · 2021-06-28T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyHUa4MwqTkhowKM4h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHkU5bsdw5P7-8JZp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdBJyQxP861oljRU14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzgl7iBXRe5tFYLHYt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzUKFC8r1V6HNUTydd4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxFxLtLEjGce9wq8iZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxI7TNMt_TdEOfHQpN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6WKiqU09QhjUsxKJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCgvj2u0e_RN985ix4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxaV7FZRp4oz9oNI6t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
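The raw response above is a JSON array with one record per coded comment. A minimal sketch of parsing such a response and checking that every record carries the four coding dimensions plus an ID (field names are taken from the output shown; everything else, including the helper name, is an assumption):

```python
import json

# Fields present in every record of the raw response shown above.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and verify each record's fields."""
    records = json.loads(raw)
    for rec in records:
        missing = EXPECTED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '<no id>')}: missing {missing}")
    return records

# One record copied from the raw response above, as a smoke test.
raw = '''[
  {"id":"ytc_UgyHUa4MwqTkhowKM4h4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''
records = parse_codings(raw)
print(records[0]["responsibility"])  # ai_itself
```

Validating fields at parse time catches truncated or malformed model output before it reaches the coding-result view.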