Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I don’t know this is a rough one. The driver was clearly reckless and would have caused an accident regardless of the vehicle they were in. Additionally, autopilot braking along with emergency braking system are disabled if the gas pedal is pressed, rendering them inoperable. This is 100% driver’s fault. Claims around “Autopilot” implying full self-driving are ridiculous. I don’t expect the phone to be inside my eye when I purchase an iPhone. Similarly, autopilot in the context of commercial aviation is an assistance tool, not a complete replacement for the crew. If there is another plane on the runway the responsibly is with the pilot to pay attention and turn off auto-land (if enabled).
Platform: youtube · Incident type: AI Harm Incident · Posted: 2025-08-18T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
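The four coding dimensions in the table can be captured as a small typed record. This is a minimal sketch: the allowed value sets below are only the codes observed on this page (`user`, `company`, `ai_itself`, `unclear`, etc.); the project's full codebook may define others.

```python
from dataclasses import dataclass, fields

# Value sets observed on this page only; the full codebook may allow more.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

@dataclass(frozen=True)
class Coding:
    """One coded comment across the four dimensions shown in the table."""
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self):
        # Reject any value outside the (assumed) codebook for its dimension.
        for f in fields(self):
            value = getattr(self, f.name)
            if value not in ALLOWED[f.name]:
                raise ValueError(f"{f.name}={value!r} is not a known code")
```

For the comment above this would be `Coding("user", "consequentialist", "none", "indifference")`; an unknown code raises `ValueError` instead of silently entering the dataset.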
Raw LLM Response
```json
[
  {"id":"ytc_UgzLP2XZvL7yD6Q3G_B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxkGOHBLr4wfRMTOiZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw4Q-lOUT7UJO_gDWZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwaSpyLIKuv8ZjhTep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx3UV6Ezq0Ydd7ftTd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzPWozF_b7BOeXtsUd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwJ9PAfmugmLcmxHcV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx14SgkPuVwYlVX7894AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzy0l3UsSUtruKw0QV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxmms-vDeDoyv1zd4x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
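Because the raw model output is a JSON array keyed by comment ID, looking up the coding for a given comment is a straightforward parse-and-index. A minimal sketch, using the first three rows of the response above as sample input:

```python
import json
from collections import Counter

# First three rows of the raw LLM response above, reproduced as sample input.
raw = """[
  {"id": "ytc_UgzLP2XZvL7yD6Q3G_B4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxkGOHBLr4wfRMTOiZ4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw4Q-lOUT7UJO_gDWZ4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

codings = json.loads(raw)

# Index by comment ID so a single coding can be looked up directly.
by_id = {row["id"]: row for row in codings}
print(by_id["ytc_UgzLP2XZvL7yD6Q3G_B4AaABAg"]["emotion"])  # indifference

# Quick aggregate over the batch: how often each emotion code appears.
print(Counter(row["emotion"] for row in codings))
```

The same dictionary powers the lookup-by-ID behavior this page describes; an ID missing from the batch simply raises `KeyError` rather than returning a partial coding.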