Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "This is stupid. That kid would have ended things no matter what. It sucks hes go…" (ytc_UgwASM95G…)
- "because ai is not the problem.. problem is data theft not just personal info but…" (ytr_Ugz2cxH5s…)
- "AI cannot make jokes or laugh or understand sympathy or feel ! They cannot dream…" (ytc_Ugz0HUktX…)
- "That is very true; the ethical considerations are an important thought here. We …" (ytr_UgyPC9oD4…)
- "🎉🎉🎉Good morning America. This is wonderful discussion which i see almost every d…" (ytc_UgxJmVKbD…)
- "Yes, AI. Definitely *not* a cabal of pedophiles ruining the economy for their ow…" (rdc_ohwvyfn)
- "As much as i agree that AI creation of all kind needs to go, I must correct you …" (ytc_UgyBwbb8g…)
- "What I find funny is becoming an AI Bro just makes you become even more separate…" (ytc_Ugxg7ZYga…)
Comment
My friend’s son was killed in January while driving a Tesla. I’m not sure if autopilot or self-driving were turned on at the moment of the crash. The accident was deemed the other driver’s fault for running a red light. However, my friend has confided in me several times that he wonders whether self driving was on and whether it had an opportunity to avoid the accident, but didn’t.
I’ve discussed consulting a lawyer about that, but my friend is very hesitant about that.
youtube
AI Harm Incident
2025-08-15T19:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
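The four dimensions above can be modeled as a small record type. This is a hypothetical sketch, not the tool's actual schema: the field names follow the table, and the `"unclear"` default is an assumption about how a comment is recorded when no code could be assigned.

```python
from dataclasses import dataclass, asdict

@dataclass
class CodingResult:
    """One coded comment; 'unclear' is the assumed fallback for each dimension."""
    responsibility: str = "unclear"
    reasoning: str = "unclear"
    policy: str = "unclear"
    emotion: str = "unclear"

# A fully unclear result, matching the table shown above.
print(asdict(CodingResult()))
```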
Raw LLM Response
[{"id":"ytc_UgwuRx9UpPhP587tdo14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgzSjj9Tp60Cr89I_tZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugy495fkc9ChMossIzB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyk1L8QXTLa0ZHUmjh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw6Jio8EXR8fpft5eR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwRgotJplF-O_rekRx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxwOlyGLFQVbRUGKN54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwY0Ozrgn3a-9CpDex4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9UWRQflY_Lol5RJp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKsTeKHXeqYl2q-kd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"})