Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID or drawn from the random samples below.
Random samples

- `ytr_UgzDrCxD5…`: "We appreciate your engagement! If you're interested in interacting with advanced…"
- `ytc_Ugw6tsPn4…`: "Sure AI can do good things; Until it becomes So powerful that it’s takes over, …"
- `ytc_UgyolnCWd…`: "Only Waymo the self driving car 100% automated that is worth every single penny …"
- `ytc_UgwOXLhHb…`: "In 2003 the US invaded Iraq. Right after the discovery of the tomb of Gilgamesh.…"
- `ytc_UgzY7ktPj…`: "GOVERNMENTS will destroy the planet before AI gets a chance to try so maybe star…"
- `ytr_UgytwXmHD…`: "@FreakingRockstar101 Oh, I don't worry about it, I know I won't be left behind b…"
- `ytc_UgyJDT4a0…`: "\"Embracing AI as a Collaborative Ally: A Path to Entrepreneurial Empowerment\" Th…"
- `rdc_hsf38gn`: "Add a delay to the file monitor that is random. Make sure it’s not easy to spot …"
Comment

> This Tesla beat up is a completely predictable take by mainstream media. The systems in the car warn the driver that they are required to maintain supervision and be ready to take over at all times. You don't need to read a manual to know that. Of course, there have been sad cases like the ones highlighted where tragedies occurred while Autopilot was engaged. What the story completely ignores is the numerous crashes that are avoids by Tesla's self driving technology. You can find many cases on YouTube. In the US, the rate of crashes in Teslas using Autopilot was 9.5 times lower than the US average. So if you really want to make the roads safer, you would want more Teslas using self driving technology. They don't get distracted or drunk or tired and they are looking in every direction all the time.
- Platform: youtube
- Category: AI Harm Incident
- Timestamp: 2025-10-20T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxVSRevNfpEsXy-Kah4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgygwhEpahYKg8OZLLB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyCtcsEgGjHLJT5PpF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpfcXZ1cVY8T8WN7V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaEv9AjrNXYNE10NZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyj5zO0aDUmpAxCdEZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxZVLm5WJguL8LuB9x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxiM8DVfMbWsCvF79h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvFkngF8l8gB30U4p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw6gOyk_7dZLjcExOF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
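The raw response above is a JSON array with one coded row per comment, each carrying the four dimensions shown in the Coding Result table. As a minimal sketch of how such a batch could be parsed and checked before it is written back to the database, assuming a closed codebook per dimension (the allowed values below are inferred from this one sample, not from a documented schema, and the real codebook may differ):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Hypothetical assumption: the real codebook may contain additional labels.
CODEBOOK = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate every coded row."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError("row is missing a comment id")
        for dim, allowed in CODEBOOK.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {value!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
rows = parse_batch(raw)
```

Rejecting the whole batch on the first out-of-codebook value is deliberate here: a label outside the codebook usually means the model drifted from the prompt, and the safest response is to re-run that batch rather than store a partially valid result.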