Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Robotaxi won’t be affected unless FSD genuinely does crash. In this case the dou…" (ytc_UgznN9-Zt…)
- "So an AI accidentally deletes an entire project drive. Or maybe a human gets a c…" (ytc_Ugy2y-etH…)
- "You know i just think we should not do AI at all. I don't think all this electro…" (ytc_Ugw8hHVqn…)
- "I earned my Bachelor’s degree in automated manufacturing all of the way back in …" (ytc_UgzWc6KB8…)
- "People think they so special.. there is nothing about your life that millions of…" (ytc_Ugxyh61nj…)
- "I asked Microsoft Copilot. It says Saagar is wrong about the Russian revolution.…" (ytc_UgxpdWYYv…)
- "Imagine everything humanity has ever done. Now imagine everything that humanity …" (ytc_UgzMn0mcW…)
- "the way china is heading with AI I wouldnt be surprised if they turn out to be a…" (ytc_Ugw0chO-o…)
Comment

> Currently Autopilot is still work in progress. Before any Tesla owner can access autopilot they need to sign an online form knowing that Autopilot is still in Beta, and they, the driver is ultimately responsible. Based on this I would never let Tesla Autopilot have full unattended control. It seems the drivers who killed the motorcyclists were distracted.

| Source | Topic | Posted |
|---|---|---|
| youtube | AI Harm Incident | 2022-09-03T18:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxX-Ayn7yUJznfc0t54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-PlPlFxZMlrKl7L14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzvDnbzXfa5FBoo7Zl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwgOExhv5yZa_XdtpB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyoednUGu41_yo24md4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwANF7DrFOhKxikA-R4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz8e0l4Nu2l9bYutH94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxG7tG4EnJHJkTtht4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzErz4Be_uJzuUAsAh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzncrNhH5J6ZBBj3N94AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
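Each record in the raw response pairs a comment ID with four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be validated before loading it into the coding table is shown below. Note the allowed-value sets are assumptions inferred only from the values visible in this one response; the project's actual codebook may define more categories, and the function name `validate_codes` is hypothetical.

```python
import json

# Category vocabularies observed in the sample response above.
# ASSUMPTION: the real codebook may allow additional values.
CODEBOOK = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "liability", "industry_self", "regulate"},
    "emotion": {"approval", "indifference", "resignation", "outrage", "fear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag missing IDs or
    out-of-vocabulary values. Returns a list of problem records;
    an empty list means every record passed."""
    records = json.loads(raw)
    problems = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append({"index": i, "field": "id", "value": None})
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append({"index": i, "field": dim, "value": value})
    return problems

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none",'
       '"emotion":"indifference"}]')
print(validate_codes(raw))  # [] -- every value is in vocabulary
```

A check like this matters because LLM coders occasionally emit labels outside the requested vocabulary; catching those before they reach the "Coding Result" table keeps the dimensions clean.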