Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `rdc_dcwk1c0`: "> replies in this thread are depressing. Not least, because after 5 threads …"
- `ytc_Ugwzt3ZYI…`: "When you say, that the robot will mow the lawn. That's all I need to hear!…"
- `ytc_Ugx6ldpxb…`: "We do not even manage to ensure Human Rights, not speaking of Animal Rights. So,…"
- `ytc_UgzTYpxE3…`: "I made Skibidi Elmo out of the prompt: “Elmo with a long neck peeking out of a b…"
- `rdc_ku64xif`: "AI will not replace software engineers. However, it will improve the productivit…"
- `ytc_UgxfRPxff…`: "Ai would never be able to show feelings like humans can. It wouldnt be able to w…"
- `ytc_Ugy1DkTq1…`: "Seeing a scout use AI for a scout task at a scout meeting and immediately saying…"
- `ytc_UgzsoVnc5…`: "Don't give up don't let AI ruin everything you've worked towards as an artist yo…"
Comment
So the driver of the Tesla seemingly had multiple actions being done by them that seemed to hold them at fault, 23 other instances that validated the individuals inability to pay attention while driving a vehicle, but the driver leaned on the reality of the auto pilot needing to be held accountable?
I mean, am I allowed to play with my phone while driving a self driving car? I’m sure thats not the case.
Also… choosing to speed up when coming to a stop sign? Why? Is the Tesla gonna fight back to me if I speed up going to a stop sign?
No?
So I mean, the driver still needs to be aware in my opinion and should 100% be at fault. That’s crazy.
youtube · AI Harm Incident · 2025-08-20T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
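A coded row like the one above can be checked against the coding scheme before it is stored. This is a minimal sketch, assuming controlled vocabularies inferred from the values visible on this page; the actual code book may allow additional labels, and `validate_code` is a hypothetical helper, not part of the tool shown here.

```python
# Controlled vocabularies inferred from values visible in this page's
# raw LLM response (assumption: the real code book may include more labels).
VOCAB = {
    "responsibility": {"user", "company", "none", "distributed", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def validate_code(row: dict) -> list[str]:
    """Return a list of problems; an empty list means the row is well-formed."""
    problems = []
    for dim, allowed in VOCAB.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coded row shown in the table above passes validation.
row = {"id": "ytc_UgwI8lQp29mHILyBN8d4AaABAg", "responsibility": "user",
       "reasoning": "virtue", "policy": "none", "emotion": "indifference"}
assert validate_code(row) == []
```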
Raw LLM Response
[
{"id":"ytc_UgwYoH8BUWQJNUbcfjt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz2EZU9RNuRziAjnX54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzC350sEEiXQTBXjdN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwI8lQp29mHILyBN8d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzl_F01hE4nghXjmEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxyjOF89fd51WvwZEN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8RZpce6dWScg_jFR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw-uJCgijbOOtdg0Fp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwTeD0eeCC6zT4B4yl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVtu70xpEPgD6OnpR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}
]