Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- “If history of life forms are any indication, we won’t be eradicated by AI instea…” (ytc_UgySHkBZa…)
- “a rogue super intelligent AI is going to kill us, it's first order of business w…” (ytc_Ugz7PXWuF…)
- “Funny, all the ads that play during this video are about the benefits of AI.…” (ytc_Ugwt_UUoJ…)
- “While it is a big bubble. At the same time. The companies that make Amazon’s ser…” (ytr_Ugy_qurc_…)
- “@fireflametheepik00001 Of course. I understand the issue. That's why I say I do…” (ytr_Ugz_00iZJ…)
- “In AI picture generation put the prompt "beautiful" woman. 99% of the results wi…” (ytc_UgweRmaxy…)
- “I remember seeing a video on waymo blocking traffic yet the passenger wasnt able…” (ytc_UgyRsY5Hd…)
- “Studies have shown AI will tell you whatever you want to keep the AI going.…” (ytc_Ugy8CEmPL…)
Comment
I have been driving a Tesla Model Y with AutoPilot and a subscription to FSD on and off for a few years now. I use AutoPilot most of the time that I am driving on the local country roads around my house and usually subscribe to FSD if I am taking any long trips.

If this driver used AP for longer than a day or 2 he should have known that AP does NOT stop for stop signs or lights and you have to watch it at all times since it does perform like a drunk new teenager driver at times. Only FSD does this follows road signs and stop light directions. AP works great most of the time (It really is more a lane centering and adaptive cruise program and not AutoPilot like the name suggests) but you have to watch it and do all the stopping for lights and stop signs and such.

Bending down to search for your phone and letting AP do everything for you was a very very dumb move of the account of this driver. Not that Tesla is blameless but I would put most of this on the driver.

In this video you keep showing FSD (Full Self Driving) and not AutoPilot. Two totally different pieces of software. FSD would probably have stopped at that stop sign and then turned left or right depending on the address he had put into the navigation system and there would not have been an accident. AutoPilot was not designed to stop at a stop sign or make left or right turns at one. It did exactly what I have experienced in real life and I would have had to do the braking to stop at the stop sign. The driver did not buy the high end software or at least was not using it if he was in AutoPilot.
youtube · AI Harm Incident · 2025-08-15T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwLkAbxotBibd7cxp54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyRHf6ENyLqEP_dRet4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzy4ocmWdhh9fFKdxR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxBnLjIlH9s9FPB78t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx-LI2OIE09ehGIb3Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyd9Glq6Bt3SkTR8e94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxbQJn5bPEhDv9gFzx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzIodfh6WzQyQkUJ-h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuGm2txsj8064zUPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzNKyX1kYaY-Epe-FB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
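A batch response in this shape can be parsed and sanity-checked before the codes are written back to the dashboard. The sketch below is a minimal example, not the tool's actual ingestion code; the allowed values per dimension are inferred from the samples visible on this page, and the full codebook may define additional codes.

```python
import json

# Allowed codes per dimension, inferred from this sample batch only;
# the real codebook may include values not seen here.
ALLOWED = {
    "responsibility": {"company", "user", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"liability", "regulate", "none"},
    "emotion": {"outrage", "indifference", "fear", "mixed", "resignation"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index rows by comment ID.

    Raises ValueError if a row is missing its ID or uses a code outside
    the known set, so a malformed batch fails loudly instead of being
    silently stored.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

After parsing, a single comment's codes can be looked up by ID, e.g. `coded["ytc_Ugz…"]["responsibility"]`, mirroring the per-comment table shown above.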