Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyZrzAfq… — "I wonder one horror story idea that lives in my brain is the idea that an AI has…"
- ytc_UgwnhXj8S… — "the ai was acctually trying to save him / edit: thanks for all the likes (if you…"
- ytc_UgxHVhHBL… — "It’s the fact also “u can be just randomly on the phone “or just speaking to som…"
- ytc_Ugw-JF_zk… — "AI right now is a tool. I imagine the human mind as a tool designed to simulate …"
- rdc_lgtgnhx — "Let’s make that company! / The Spotify of Voice actor AI models. / Market place st…"
- ytc_UgzxO_TZY… — "All cars in Europe are designed to ensure that all the doors will automatically …"
- ytc_Ugw76mv8G… — "AI art can be useful if I want to envision a scene from a book or character in a…"
- ytc_UgwJuvNJf… — "I don’t think we realize how big AI can be in the next few years alone. It’s in …"
Comment
OK, this is left wing hog wash. Tesla's don't purport to be self driving cars. As the first driver said, self drive supervised. Too many people who have had bad experiences have not been supervising or doing stupid thing like sitting in the back seat. No where in this report states what version of software was being used. Version 13 was a step change in performance, driving better than most humans. Version 14 is now better and doesn't mess up on any of the edge cases shown here. Stop going on about Lidar as it is lame, just adds noise to the AI input. I don't have laser beams shooting out of my eyes. You need to interview people who know what they're talking about not just people who are being bribed by Lidar manufacturers or big oil or just have a gripe and with Elon. Most of these accidents were caused by idiots and many of the Tesla's were not properly in self drive mode. 60 Minutes needs to do proper investigating. I'd laugh if Tesla sues you for deformation.
youtube
AI Harm Incident
2025-10-19T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugxyeb2cfbNbPTXmbTd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2FKdjndHryBhPM-t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-OWEarLxiOJeQH9R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwNv2xqN6w5xQNGwGN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwIYLFnJEWR1gXEQpJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2jI5CfQG2X1Olmrd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwkh6nHxv7j4ZpBAWB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw9M5PfWA967r6yXXV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwBBc00Y_99VVXhBKh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyQFJkjuksOJJCfq8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
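When ingesting a raw response like the one above, it helps to validate each row before accepting its codes. The sketch below is a minimal parser, assuming the value sets visible in this page's data (the real codebook may allow more categories, and the `ALLOWED` table here is inferred, not authoritative):

```python
import json

# Allowed values per dimension, inferred only from the responses shown
# above -- an assumption, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"user", "none", "ai_itself"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"none", "unclear", "ban"},
    "emotion": {"outrage", "approval", "fear", "indifference"},
}

def validate_coding(raw: str) -> list:
    """Parse a raw LLM response (a JSON array of coded comments) and
    reject any row whose dimension value falls outside ALLOWED."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
rows = validate_coding(raw)
print(rows[0]["id"])  # ytc_example
```

Validating up front keeps a malformed or hallucinated category from silently entering the coded dataset; a failed row can then be re-queued for the model rather than hand-patched.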