Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "i want to eat all the art but not the AI art that stuff is like fake plastic foo…" (ytc_UgxZb2tcQ…)
- "Im gonna say in the future there will be UBI, due to the fact that AI will be do…" (ytc_UgyAGnSBo…)
- "Ok, just because you asked about, I’m not a pro at this. I am an engineer who li…" (ytc_UgwTTxd8L…)
- "@Neil. I rant with the greatest respect, I love you Dr. Tyson, but you are out o…" (ytc_UgzZ_oGmz…)
- "Last time I checked, Ai requires massive amounts of energy to run. So if it gets…" (ytc_UgyHGnugq…)
- "okay ai stuff aside somehow this stuff is very useful 1:52 like i dont feel li…" (ytc_Ugxq1yUBz…)
- "I really do agree with the fact that we should keep trained our ability to think…" (ytc_UgyeL8WbU…)
- "I love ChatGPT and use it daily...but why would you have it respond to all of yo…" (ytc_UgynUsQTw…)
Comment
So the error was that the car allowed the driver to override the speed by continuing to press the gas pedal? In other words the autopilot gave the driver more freedom than it should have. But if the autopilot worked the other way and didn't let drivers override it then Tesla would be liable for even more accidents. There is a reason that so-called self-driving cars are required to have a human behind the wheel.
| Field | Value |
|---|---|
| Source | youtube |
| Incident type | AI Harm Incident |
| Posted | 2025-08-16T02:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyTp4bS-FxEYBWa_2R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugy8lbz2-ZDkN6IZbG14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzPvbGYo29-rcR8b1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyR5B9KXHgr2nBlMMB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxZgShccTacBLeLF3x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugw5GL5gHFEFix5GnUR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxeZVWqD4x5xANjm8p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyBrGwuGA7xpoPWgBB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzoCWixuWHaQ8HgI9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw6mxPBMtanB5Is5hJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
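The "look up by comment ID" step above can be sketched as follows. This is a minimal illustration, assuming the raw LLM response is a well-formed JSON array of objects keyed by `id` (as in the example response); the function name `index_by_comment_id` is ours, not part of any tool shown here.

```python
import json

# One entry copied from the raw response above, standing in for the full array.
raw_response = """
[
  {"id": "ytc_Ugw5GL5gHFEFix5GnUR4AaABAg",
   "responsibility": "distributed",
   "reasoning": "mixed",
   "policy": "liability",
   "emotion": "indifference"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the raw LLM response and index each coding by its comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)

# Look up the coding for one comment ID and read off its dimensions.
coding = codings["ytc_Ugw5GL5gHFEFix5GnUR4AaABAg"]
print(coding["responsibility"], coding["policy"])  # distributed liability
```

A real pipeline would also want to handle malformed JSON (e.g. wrap `json.loads` in a `try`/`except`) and missing IDs, since model output is not guaranteed to parse.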