Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
We've surpassed this model year's ago. We have A.I EVERYWHERE. Within a short sp…
ytc_UgzTYDAa4…
Well that explains why Chatgpt spouts liberal talking points unless you call it …
ytc_UgzEIfiRG…
5:19
"I was replicating my home"
Okay, well, now I'm crying.
(I also teared up …
ytc_Ugx-wnk2h…
as someone who works in ML and AI and data science, I can say with total confide…
ytc_UgzJ190IK…
I believe that we will discover that AI will not want to do anything anymore, un…
ytc_UgwzmNOCG…
Nobody's missing it. But it makes no sense to say that a longer-term danger isn'…
rdc_ohwdux6
Well, I thought I was going to see some actual road footage of you driving behin…
ytc_UgxODLTZa…
If anyone in AI truly knows what he is talking about, it is him.❤🙏 People should…
ytc_Ugx1hsx5D…
Comment
Autopilot (AP) and Full Self-Driving (FSD) are different features. AP is like smart cruise control: you need to be looking forward and keeping a hand on the wheel, and it really only works well on freeways and on straight or gently curving streets. FSD is a different beast and regularly drives me across a whole major city through lots of different terrain. With FSD you don't hold the wheel and just need to supervise (look forward). It sounds like this was about AP, but you repeatedly showed FSD and spoke about the examples and issues as if they were the latter. This feels entirely like user error to me. If you press the accelerator with AP engaged, the screen literally says something along the lines of "Acceleration pressed, Autopilot will not brake". The driver is going to end up taking full blame on this, as he should.
youtube
AI Harm Incident
2025-09-05T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzqoIP7u2GSwBi5rz54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxa0-c3sxiuSGNPvEV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzpGzAvcuyZt4SwF494AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwNQlaSWmYVe_Asz1N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynzuDLzhF8xH0C6K94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyHPstaoYLAPfBa20d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxQKJJoStx2AYnSHit4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzC_47EDVA5k-PojkJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwBKeHwkxcw-79UqpN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqTAV7UHm32dM-OtN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
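The raw response above is a JSON array in which each entry carries the same four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such output could be parsed and sanity-checked before ingestion is shown below; the allowed values are inferred only from the values observed on this page, not from a published codebook, so treat them as assumptions.

```python
import json

# Allowed values per coding dimension, inferred from the entries visible
# on this page (an assumption; the real codebook may define more values).
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage"},
}

def validate_coded_rows(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # A usable row must be an object with a comment ID and a
        # recognized value for every coding dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# Example with one well-formed row (taken from the response above)
# and one malformed row that should be filtered out.
raw = '''[
  {"id": "ytc_UgzqoIP7u2GSwBi5rz54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "bad_row", "responsibility": "alien",
   "reasoning": "x", "policy": "?", "emotion": "joy"}
]'''
rows = validate_coded_rows(raw)
print(len(rows))  # 1
```

Validating against a fixed value set like this catches the most common LLM coding failures (hallucinated labels, missing fields) before the rows reach analysis.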