Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Why are hitting the other cars/commuters the main options in this video? Doing so creates a potential chain reaction that gets out of control that no one or computer will be able to predict. Obviously, in this scenario, the answer is for the self-driving car to slow down as best as possible and navigate within its own lane to avoid the fallen object or hit it as safely as possible. This seems to be the only option that possibly makes sense... Hurting/running into other people isn't really an option regardless of the situation you are in.
Source: youtube | Incident: AI Harm Incident | Posted: 2019-05-14T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyY8hea89mmyVR35pN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwY7v7cjww7oOQlHrt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyB0_b6TK8PT7vISdx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyufpnIhu4nx08cBkN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgydFUcos-c2u9kplcx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyP5A_agca7B0tqbB54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwfznTscFqr9M1BaA14AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzjIPWV9pMw8zPPp914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4-0RV-1R8m98VkZ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyD53mqEZwoHOxpHE94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
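A batch response like the one above has to be parsed and checked before the codes reach the database, since the model can emit malformed rows or out-of-codebook labels. The sketch below shows one way to do that in Python; the allowed value sets are inferred only from the labels visible in this sample batch (the project's full codebook may contain more categories), and the function name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from this sample batch only;
# the real codebook may define additional categories (assumption).
CODEBOOK = {
    "responsibility": {"ai_itself", "company", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows
    whose values all fall inside the codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip rows missing the comment ID
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(len(validate_batch(raw)))  # → 1
```

Rows that fail validation are dropped rather than repaired here; in practice such rows would typically be queued for a retry or manual review so no comment silently loses its coding.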