Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.

Random samples
- "Facebook has a midjourney type product on messenger, go type /imagine on it and …" (rdc_kokss01)
- "ChatGPT is a souless tool that is meant for our advancement. Never allow this mi…" (ytc_UgxsHY87z…)
- "That’s VITA:PS3/PS4/PS5: XBOX 360/ONE/SERIES:STEAM 8 networks like 8 character o…" (ytc_UgxCU-lfE…)
- "I had a lady come into my job on the 24th and she needed a canvas printed out as…" (ytc_Ugwc7Jz5b…)
- "What about ultra low power network running under 1GHz that users have no clue ab…" (ytc_Ugwj3e8vn…)
- "So the actual interesting question this article poses is what we want to be done…" (rdc_ogp037g)
- "Nooooo. Why did I just come here from a really weird Character Ai chat and this …" (ytc_Ugw-ZjlRe…)
- "I don’t believe it will take a century to see robots as intelligent as humans. W…" (ytc_Ugyb7pamQ…)
Comment
IF the autopilot fully disengages when you do an action like speed up and etc. Then i wouldnt blame the auto pilot at all. If it DOESNT disengage and SHOULD work normally just speed up then yes there is 100% fault on the auto. Question is Would he have crashed if he didnt do any input and let it on autopilot in this exact same situation? I would love to see a test without a person in the car obviously and the exact same situation EXCEPT the speeding up if it crashes Tesla is 100% is guilty if it doesnt crash its kinda his fault?? But I think the automatic system should always be ON at all times to break when it detects something. It should ALWAYS be on. even if autopilot is off. I will add I think you should always pay some attention to the road even on auto pilot I would NEVER trust it 100% no matter how its perfect because You can break and save yourself if something happens as long as you pay attention. But thats just me.
| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Harm Incident |
| Posted | 2025-08-15T18:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
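
To make the record structure concrete, here is a minimal Python sketch of one coded comment. The class and field names are illustrative, mirroring the dimensions in the table above; the value sets listed in the comments are only those visible on this page, not a complete codebook.

```python
from dataclasses import dataclass

# Illustrative record type for one coded comment. Field names mirror the
# dimensions in the table above; the example values in the comments are the
# ones visible on this page, not an exhaustive codebook.
@dataclass
class CodingResult:
    comment_id: str
    responsibility: str  # e.g. "ai_itself", "user", "company", "distributed", "none", "unclear"
    reasoning: str       # e.g. "consequentialist", "deontological", "contractualist", "unclear"
    policy: str          # e.g. "regulate", "liability", "none", "unclear"
    emotion: str         # e.g. "outrage", "fear", "mixed", "indifference"
    coded_at: str        # ISO 8601 timestamp, e.g. "2026-04-26T23:09:12.988011"

# The coding shown in the table above as a record; the comment_id is taken
# from the matching entry in the raw response below.
example = CodingResult(
    comment_id="ytc_UgxVNOUcSMXW2HAejnZ4AaABAg",
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="unclear",
    emotion="mixed",
    coded_at="2026-04-26T23:09:12.988011",
)
```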
Raw LLM Response
```json
[
{"id":"ytc_Ugw3SaUrZX2yOzQ_6Oh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxVHhzoJA5ZGTRNuXB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgysDxWZFwMGJvuStPt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwVm3rQ3D6BcwQpuwx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyfVw9u8F179VQgM8J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxv1KqFmv4nmdmGM5F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxVNOUcSMXW2HAejnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyMPT61al517ReEEip4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxQ4DmOOAHfyLa3tvR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyK1M9lzunLIii4XUd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"fear"}
]
```
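
If the raw response needs to be consumed downstream, a minimal parsing sketch is shown below. It assumes the model returned a bare JSON array like the one above; the helper name parse_raw_response and the strictness of the validation are assumptions for illustration, not part of the tool.

```python
import json

# Keys every coded record is expected to carry, based on the array above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the coded dimensions by comment ID.

    Assumes the response is a bare JSON array; real model output may first
    need surrounding text or markdown fences stripped.
    """
    coded = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {sorted(missing)}")
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS - {"id"}}
    return coded

# Usage:
#   coded = parse_raw_response(raw_text)
#   coded["ytc_UgxVNOUcSMXW2HAejnZ4AaABAg"]["responsibility"]  # "ai_itself"
```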