Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews):

- "Ridiculous.. MIND is not self. KNOWLEDGE is not identity. MY loved ones are not …" (ytc_UgwpmE9TJ…)
- "I confess I treated the Geneva convention as a checklist in character ai also u…" (ytc_UgyXvO0Zi…)
- "Absolutely horrifying. These scientists may mean well. But they forget they d…" (ytc_Ugyuwp3wT…)
- "We are literally living in the prequel to The Terminator. Who will be our Sarah …" (ytc_UgwYR1vPD…)
- "Aaah... but here's the question: How do we know Hank isn't a robot? Could someo…" (ytc_UghTmXF35…)
- "Too many contradictions in your own speech, sir. Neurolink is a hard sell to tho…" (ytc_UgxiUpmNa…)
- "He literally said, \"only thing that can stop a bad guy with AI, is a good guy wi…\"" (ytc_UgxSGGrbB…)
- "I don't like Ai art, Ai shouldn't be used to make art, it should be used to help…" (ytc_UgzPqB4yQ…)
Comment
When the justification for a crash is "the computer was not trained on this situation", it just means your tech does not work, period! The roads are full of various events and situations and you will *never* be able to train any kind of AI on all of them. Not all events are life-threatening, but all of them should be learned anyway. A Lidar or a Radar would have picked up an obstacle on the road, no matter if it knows what it is and stopped. Relying solely on cameras and a software to identify something as an obstacle is incredibly dumb and can only lead to more fatalities. If musk tells you otherwise, he is a criminal and the blood of all these victims is on his hands.
YouTube · AI Harm Incident · 2025-04-10T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgypwK1qm398cYaqVQN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugysm00SPxqpDXYN0-Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},{"id":"ytc_UgwzszctqPTvowV1Urd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgxRr_3CNZSU_TKpOkh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgxxqjI95JtdpoBMFdR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgyzHoayWj-2Woi-YuJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},{"id":"ytc_Ugz_YC2h1CpwnhxQEfp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgyJeGOCuy7qj7U3Wt94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},{"id":"ytc_UgyIDdCyXQw9ZAN0Gcp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgwZKukkmp-_8m8Exel4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]
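The raw response is a JSON array with one coding object per comment, each carrying an `id` plus the four dimensions shown in the table above. A minimal sketch of looking up a coding by comment ID, assuming that schema (the sample data below is abridged from the response, and `lookup_coding` is a hypothetical helper, not part of the tool):

```python
import json

# Abridged sample of the raw LLM response: a JSON array of coding objects.
# Schema assumed from the response shown above.
raw_response = """[
  {"id": "ytc_UgypwK1qm398cYaqVQN4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugysm00SPxqpDXYN0-Z4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]"""

# The four coding dimensions reported in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw, comment_id):
    """Parse a raw response and return the coding dict for one comment ID."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            # Keep only the expected dimensions, defaulting to "unclear"
            # when a field is missing from the model output.
            return {d: entry.get(d, "unclear") for d in DIMENSIONS}
    return None  # ID not present in this batch

print(lookup_coding(raw_response, "ytc_Ugysm00SPxqpDXYN0-Z4AaABAg"))
```

If the model emits malformed JSON (as raw outputs sometimes do), `json.loads` raises `json.JSONDecodeError`, which is why inspecting the exact output per comment is useful.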