Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- I always find it bizarre that people seem to be more concerned comparing autonom… (ytc_Ugz1XmMUd…)
- Im a destroy your Ai robot. So im ready for the war if any robot comes around me… (ytc_UgxqihjhP…)
- "You arent being forced" You aren't but you kinda are being shamed.. if people … (ytc_Ugx0SeyjD…)
- Ai: expesive to produce and to repair Human: just needs to be able to affort th… (ytc_UgyaPKc02…)
- I am offended by these AI vids and I am white.I have no worde to adequately desc… (ytc_UgwdJQpMc…)
- Anyone saying anything about "not getting their food stamps" is either AI or a r… (ytc_UgzHdWCA-…)
- If AI got to the point that it could take care of us (and itself) completely, gr… (ytc_UgznhfIi8…)
- Man, you are naive. You really think there is no way to test for sentience? Try … (ytc_UgxuMan50…)
Comment

> Your analysis sounds fine yet it does not capture the essence of the dilemma. If the neighborhood car makes a sudden turn, it may affect other cars, not necessarily but likely. There will be a cost then. And there will be a chain effect for this incident, and the cost is hard to estimate. The essence of the story is that any emergency may pose a decision-making dilemma (may not be fatal but to a certain degree) for the AI, who would find it hard to reconcile between efficiency (for what it is designed) and moral requirement.

youtube | AI Harm Incident | 2024-03-25T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgyF6F5mYMV-Pa5y5GZ4AaABAg.8hdOXOKKyqR8kbvmeVyUHC","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwkhoJUXCOdqrrdO_F4AaABAg.8ZuDxkqG2qH98bK0NRx2AW","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytr_UgwE0M-nX5hQSlgJMIt4AaABAg.8ZLDyabu4ux9mLu7ZsOo7e","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UghcPoA1NFGlengCoAEC.8NyX3Egi-Fc9mLtvosPf4W","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytr_UghcPoA1NFGlengCoAEC.8NyX3Egi-FcA1PIA3250ul","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8NwFFi8Pqfy","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8OAIyM-R3lw","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8OASggjEx2E","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgiJaxBMly9MvXgCoAEC.8Ndi_y4XrqB8NolwAfwMh6","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgiJaxBMly9MvXgCoAEC.8Ndi_y4XrqB8NpIBpJMJ3W","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"}
]
```
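The raw response is a JSON array with one record per coded comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a response could be parsed and validated before lookup (the allowed category sets below are inferred only from the values visible on this page and may be incomplete; `validate_coding` and `ALLOWED` are hypothetical names, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the records shown above.
# Assumption: the full coding scheme may contain additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "liability", "regulate", "industry_self", "ban", "none"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval"},
}

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record response for illustration
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = validate_coding(raw)
print(coded["ytr_example"]["emotion"])  # fear
```

Indexing by ID mirrors the "Look up by comment ID" workflow above: once validated, any coded comment's dimensions can be retrieved directly from the dictionary.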