Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (previews and comment IDs truncated):

- "AI will not be able to do everything that humans can do. For example an AI-woman…" (ytc_UgxdqOt5c…)
- "Something that stands out for me is Humans are or have Nefarious tendencies (usu…" (ytc_Ugx7Cuszr…)
- "people cant even take transgender surgeries, driving or voting or dating serious…" (ytc_UgyzNf0yD…)
- "You could feed the entirety of Jazz into "AI" and in a million years it would ne…" (ytc_UgzYzZZQX…)
- "I would rather see a world filled with self-driving cars than human driven ones …" (ytc_UgiwHyTQg…)
- "The bastards need to use sea water and pay the upfront costs to use titanium and…" (ytc_UgzX1YmvO…)
- "I have a lot of respect for artists and anyone that spends so much time honing a…" (ytc_UgzOc_K3n…)
- "counter argument in favor of a.i. "art" / if its so easy and the program "creates …" (ytc_UgxggWERn…)
Comment
People who say it's the pedestrian's fault, and the dumbass should have looked before she crossed ....blah blah blah ....they're just totally not understanding the issue here.
The issue is that this type of scenario is extremely common. Any competent autonomous car should have detected the pedestrian. So what if it was dark..? I would have thought they're supposed to have lidar & radar and shit like that for seeing in the dark.
What gives?
It would appear there has been a critical failure in either the sensors ...or the software simply failed to respond correctly.
And either scenario is simply unacceptable. This goes to show just how shit the technology still is ...and how much more work still needs to be done.
youtube · AI Harm Incident · 2018-04-28T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwT6CzwEalsc0GRvk94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx81CE_mpNZMIFasoB4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw37tso1jWIqITp3op4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwn08bQGUe5XQClK7N4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwi7VOngjBC1BEMqD54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyzUGCjuZwGNqNx4FJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyOyYIaRH_XV46nRiZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwLTVoZzUN7QFqSVMp4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw4p04_FSUAO5gGf-14AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzKcm7DeUQgw_CM4Mt4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
```
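A raw response like the one above can be turned back into per-comment coding results with a small parser. The sketch below is a minimal, hypothetical example: the allowed label sets are inferred from the values visible in this response and the table above, not from an authoritative schema, and `parse_raw_response` is an illustrative name rather than part of any real tool shown here.

```python
import json

# Label sets per dimension -- inferred from the sample output above,
# not an authoritative schema.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "ban", "unclear"},
    "emotion": {"approval", "resignation", "fear", "mixed", "outrage", "indifference"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coded comments)
    into {comment_id: {dimension: value}}, coercing any value outside
    the expected label set to "unclear" instead of failing."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec.get(dim, "unclear") for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                codes[dim] = "unclear"  # tolerate off-schema labels
        coded[rec["id"]] = codes
    return coded

# Example lookup by comment ID (hypothetical ID):
raw = '[{"id": "ytc_x", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}]'
print(parse_raw_response(raw)["ytc_x"]["policy"])  # liability
```

Coercing unknown labels to "unclear" rather than raising keeps one malformed record from discarding the whole batch; a stricter pipeline might log the offending ID instead.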