Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "What are y’all using ChatGPT for that you’re arguing with it? Last night I used …" (rdc_oi42wxs)
- "Has anyone paid attention to what games like Horizon Zero Dawn have tried to war…" (ytc_Ugx1Vb87g…)
- "regulation is the only answer - cannot believe I said that but it comes before U…" (ytc_UgzVCMYQZ…)
- "the fact that an act like that has to exist is itself enough to argue against ai…" (ytr_Ugycc4tZy…)
- "As an apprentice in platform development, I completely agree. I often use AI for…" (ytc_Ugxd6KePH…)
- "Alex Avila makes the same shadiversity arguments, (and also spread misinformatio…" (ytc_UgxXts7xT…)
- "@BrendanDellvery soon they will and it will be the birth to something called the…" (ytr_Ugz3K-5YL…)
- "Even ChatGPT can train you on whatever you want and no taxes are required to pay…" (ytr_UgwFNwim3…)
Comment
“I dont think people should expect perfection” — wft is that?
It immediately raises the question: why is beta-testing being done on real public roads in the first place? And more importantly — who takes responsibility when a fatal crash happens? “Don’t expect perfection, accidents happen” is not an acceptable excuse. If a human driver hits someone, they go to jail. But if a robotaxi does it, who’s accountable? The company? The engineers? No one?
Source: youtube
Posted: 2025-12-10T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy9yBdMmNZWUEdmykB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgFZsTlKIF_Kc1WnJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7FAKSP8mBo-1XGdR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx1RIXXYJ18w8GylAZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQIqOzbKB5dKVn-W54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx24sSiIN8FIlLhS3J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwkzPNDYJEsejt9X1Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxvMWGJXXaftF0J6uh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwvHaR-WDYCbo-W8-p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyf52QqwMUevQAReYJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
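As a minimal sketch, a raw response like the one above can be parsed and sanity-checked before its rows are written into the coding-result table. The allowed label sets below are inferred only from the values visible in this sample; the real codebook may contain more labels, and the `parse_codings` helper is hypothetical, not part of the tool shown here.

```python
import json

# Label sets inferred from the sample response above (assumption: the codebook
# may define additional values not seen in this batch).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any row with an unknown label."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Hypothetical one-row response, shaped like the sample above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
codings = parse_codings(raw)
```

A row that fails validation is rejected whole rather than silently coerced, so a malformed model response surfaces at ingest time instead of corrupting the coded table.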