Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Waiters are already being replaced 💀 so many restaurants and grocery stores are …" (ytr_UgyhK9T4i…)
- "he tried to free you guys from enslavement of capitalism. chill guys , lets a.i …" (ytc_Ugw8Z2u87…)
- "I think 🤔 let them create a digital world for themselves and let’s see what beha…" (ytc_UgxEvAE-6…)
- "What an abomination to God Almighty. It is so sad that we are now asking a robo…" (ytc_Ugx6rO72w…)
- "Nah real sentient AI would be really cool and scary, right now, it's just really…" (ytr_Ugxq8Hmok…)
- "Seriously concerned on advancements in artificial intelligence and the governmen…" (ytc_UgzkiKUbK…)
- "Ethical??? are you kidding? EAch interaction with human make AI better and bet…" (ytr_Ugxrg8bdb…)
- "@mimimoomoo2902 What does that have to do with whether AI is theft? But fine,…" (ytr_Ugxu7mJnV…)
Comment
The biggest safety issue in every car is the human driver.
Teslas "Autopilot" is only a driver assist system. The Model S is not a self driving car. The human driver is clearly responsible for this crash.
However automatic braking is not new and should have worked in this case. The radar should have seen the obsticle, while the camera failed to do so.
Mercedes for example has automatic braking for quite a while and afaik there haven't been any reports about failures like that.
youtube · AI Harm Incident · 2016-07-02T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx0qtypp6SCITTSqRl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgznRgyu8fu1bQYeY8h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy1U7irBqxMzWnBSYZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTorq7V9o2sLu92wJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0VnwSlCpl-Pfc78h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx1lzNSE5QEBF1gho54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlDKVGN9w4WFpMXWF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwRrdN4ObLi77vdRb94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwjE1GeJpxUTR3uf9x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiRdMUEklJtFXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
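The ID-based lookup this panel performs can be sketched in Python: parse the raw LLM response (a JSON array of per-comment codings) and index it by `id`. This is a minimal sketch, not the tool's actual implementation; only two of the ten entries above are reproduced for brevity.

```python
import json

# Two entries from the raw LLM response shown above (the full array has ten).
raw_response = """
[
  {"id":"ytc_UgwRrdN4ObLi77vdRb94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgiRdMUEklJtFXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the Tesla Autopilot comment; its values match the Coding Result table.
tesla = codings["ytc_UgiRdMUEklJtFXgCoAEC"]
print(tesla["responsibility"], tesla["policy"])  # user liability
```

The `ytc_`/`ytr_` prefixes in the IDs appear to distinguish top-level comments from replies, so the same dictionary serves both kinds of lookup.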