Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Look, I despise Musk, and anyone being permanently injured and/or killed on the road is a tragedy. I also don't own a "self-driving car" nor have I ever owned a Tesla. That said, if we compare the accidents humans cause versus the number of accidents self-driving cars have caused, the difference is astronomical, even factoring out the percentages of said vehicles on the road. Not only this, but a self-driving car never drives drunk, never gets into road wars with other cars, etc. Driving to work, you can see dozens of people on their cell phones. People can't seem to put down the phone, so I think we need self-driving cars......
youtube · AI Harm Incident · 2025-08-16T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzY9pmELx64ghMC_wV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLoINY9kFe03x5mZl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyN1EBnNyibJ7O40DB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz_jWCDbuSaRQBAPBh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwkZkk1SpBCdpTvToZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"ban","emotion":"indifference"},
{"id":"ytc_Ugz6chf2timwKrLqckJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRD7zfqtNLq9FR8ux4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz8dm3JWe9ATDoddep4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzHwNntf2AVGseRYtp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRV6RhMObFzKd5x5t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
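A batch response in this shape can be parsed into per-comment codings with a short script. This is a minimal sketch: the four dimensions come from the sample above, but the allowed-value sets are inferred from the values visible here and are assumptions, not the coder's actual schema.

```python
import json

# Value sets observed in the sample batch above; the real codebook may
# allow more values than these (assumption for illustration).
ALLOWED = {
    "responsibility": {"none", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "outrage", "resignation", "fear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and return {comment_id: coding}."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            # Reject missing dimensions or values outside the known set.
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-row batch for demonstration.
sample = ('[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist",'
          '"policy":"none","emotion":"indifference"}]')
print(validate_batch(sample))
```

Validating each row before storing it means a malformed or off-schema model response fails loudly at ingest time rather than silently polluting the coded dataset.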