Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Elon is a huge bullshitter and he is going to teach AI not to lie? Same guy who … (ytc_UgzD1mAyh…)
- *I miss when my grandma used to sing me lullabies with countries nuclear codes, … (ytc_UgwOJIHv_…)
- I thought he was going to sell an AI course, but it looks like AI has already re… (ytc_UgxXVaGBp…)
- No, the issue is the plain english will need to always have a translation layer,… (ytc_UgznTmolZ…)
- *In other images of a text conversation online, one user declares: "Long live th… (rdc_dlge2hw)
- Unfortunately for them, my rates have doubled when they call back after laying m… (ytc_Ugx7SYC4p…)
- Real video evidence gets released and the criminal screams "deepfake", this is w… (ytc_UgwGh7qCV…)
- Humans don't need electricity, internet, AI, or money. We've lived for many tho… (ytc_UgztXG4RC…)
Comment
"Hundreds" of accidents over the past decade sounds small. Tesla and other automakers that use similar systems are in a no-win situation. Even if vehicles with Autopilot and FSD are safer than those without it, there will still be accidents. And if it could be shown that these semi-autonomous systems should have been able to avoid some of the accidents, people will claim that these systems are unsafe and should not be used. I don't own a Tesla anymore and never will again, but since my first Tesla with Autopilot in 2017 to the 2023 Tesla with FSD that I sold this past January, the difference is night and day. You've always needed to pay attention to where you are going, especially with early versions of Autopilot. But there will always be people who want to play on their phones instead of properly operating the vehicle that will screw it up for the rest of us.
youtube · AI Harm Incident · 2025-04-09T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
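
The four dimensions in this table correspond one-to-one to the fields in the raw model response below. As a minimal sketch, assuming a Python pipeline, one coded record could be represented as follows; only the category values visible on this page are listed, and the full code book may define others:

```python
from dataclasses import dataclass

# Sketch of one coded record. Field names follow the raw response below;
# the example values in the comments are only those visible on this page.
@dataclass
class CodedComment:
    id: str              # platform-prefixed comment ID, e.g. "ytc_Ugyp..."
    responsibility: str  # "developer", "company", "government", "distributed", "none", "unclear"
    reasoning: str       # "consequentialist", "deontological", "virtue", "unclear"
    policy: str          # "regulate", "liability", "ban", "none", "unclear"
    emotion: str         # "outrage", "fear", "resignation", "indifference", "approval", "unclear"
    coded_at: str | None = None  # ISO timestamp, as in the "Coded at" row above
```

Keeping the dimensions as plain strings rather than enums is a deliberately loose choice here: it leaves the sketch tolerant of code-book categories not shown on this page.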
Raw LLM Response
[{"id":"ytc_UgypwK1qm398cYaqVQN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugysm00SPxqpDXYN0-Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},{"id":"ytc_UgwzszctqPTvowV1Urd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgxRr_3CNZSU_TKpOkh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgxxqjI95JtdpoBMFdR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgyzHoayWj-2Woi-YuJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},{"id":"ytc_Ugz_YC2h1CpwnhxQEfp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgyJeGOCuy7qj7U3Wt94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},{"id":"ytc_UgyIDdCyXQw9ZAN0Gcp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgwZKukkmp-_8m8Exel4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"]}