Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The fundamental problem here is that Autopilot is a single entity. All mistakes in one car with Autopilot are present in ALL cars (with the same version of software). If we put humans under the same scrutiny, how many of us would be perfect under any conditions? When a human makes a mistake and kills somebody on the road, it is an "accident". We don't even attempt to fix all humans to not do that mistake again. Now, when we have this single entity, the Autopilot system, under such scrutiny, we will be able to fix it beyond any living human's capabilities with time and then at least rely on it to make predictable mistakes. That is the real benefit of automated driving; make it more predictable. In the current mix, we need to realize that accidents happen and will happen, no matter how smart we are trying to be and no matter how much lawsuits are thrown against car makers. It is the engineers that will fix the problems, not lawyers.
youtube · AI Harm Incident · 2022-09-16T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxHrCI0uHIqia0hz-J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyoJfSZ3NurWPRytpJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzqCU3ivdbiCdo8MJ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzPLneU_YBeBgW6gTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxkM5h3_URiw7lZ2nt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzsYd5MwpJOVPG0NYR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxyN_WHQ7Xe-a7UFhB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgxdS4gDGscj05gH2id4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0B_TNxRfQkZzUon14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxxkB_ABeZtqWqGF-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
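The raw response is a JSON array of per-comment codes across the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such output could be parsed and schema-checked before storage — the allowed value sets below are assumptions inferred only from the values that appear in this response, and `parse_codes` is a hypothetical helper, not part of any real pipeline:

```python
import json

# Assumed per-dimension vocabularies, inferred from the response above.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear", "none"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "mixed", "resignation", "indifference", "outrage", "unclear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw model response and keep only records whose
    dimension values fall inside the assumed vocabularies."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items())
    ]

# Two example records copied verbatim from the response above.
RAW = '''[
{"id":"ytc_UgxHrCI0uHIqia0hz-J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqCU3ivdbiCdo8MJ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

codes = parse_codes(RAW)
print(len(codes))  # count of records passing schema validation
```

Validating against a fixed vocabulary like this catches the common failure mode where the model invents an off-schema label, so malformed codes are dropped rather than silently written to the dataset.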