Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI isn’t just a job threat — it exposes how the system already prioritizes profi…
ytc_UgxvwZX5Y…
That line "Art is hard" anger's me so much like yeah we know its hard but you sh…
ytc_UgztWVN_h…
I asked AI. Who is in control of AI. This is the response 0:01: “The control of …
ytc_Ugw2GUsHK…
AI would need access to “functional” emotional centers in the brain in order to …
ytc_UgyNkw0Xv…
A well known issue with ai is it just making things up that are not true. It mak…
ytc_UgxECXR-7…
Honestly none of this makes any sense. I think the real plan is once they automa…
ytr_Ugyc_Hu6E…
You talk of amoral AI. But that's not the issue.
The issue is IMMORAL leadership…
ytc_UgxqiC4XE…
I studied LLM's, and I would love to go back to when they first started making n…
ytc_UgwRJFd45…
Comment
I’ve been saying for years that these companies should bear the responsibility for when their products cause auto accidents. The major concern with self-driving vehicles is that they are programmed to prioritize the safety of itself and its own passengers at the expense of everything else… hence great danger to bicyclists, pedestrians, and other vehicles on the road. If this decision makes some of these companies rethink their programming and product testing practices, it’s a big win for society.
Platform: youtube
Category: AI Harm Incident
Posted: 2025-08-16T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy6jc29xY9Yu2xo1Rd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzi1ZaK1Gdma3Qj8xd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxCN1JAymUM7Ha6oyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgynHWVGjsKdUpzhk5h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxTsi5fn5x6sSux5-V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgywHYWkZw-SZpDbkrB4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyQEC0FaXVdGxop7iB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw066AApXNjFtxCJ7J4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyb5ol4Qvmde5uSVCJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxUDZ4Yg-Nglm6wUTh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
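A raw model response like the one above is a JSON array of codings keyed by comment ID, one object per comment with four categorical dimensions. Before storing such a response, it is worth parsing and validating it, since an LLM can emit malformed rows or labels outside the codebook. Below is a minimal sketch; the allowed label sets are only those observed in this sample (the actual codebook may define more values), and the `validate_coding` helper is hypothetical, not part of the tool shown here.

```python
import json

# Labels observed in the sample response above; the real codebook may
# include additional values -- these sets are illustrative, not exhaustive.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed"},
}

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID.

    Rows missing a dimension or using an unknown label are skipped, so a
    malformed model response cannot silently corrupt the coded dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        # IDs in the samples use ytc_/ytr_ prefixes (comments vs. replies).
        if not cid.startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            coded[cid] = row
    return coded
```

Indexing by ID also supports the "look up by comment ID" inspection shown above: `coded["ytc_…"]` returns that comment's full coding row.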