Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.

Random samples
- `rdc_my4liki`: "Yes exactly, it drives me nuts. Who would have thought I’d long for people’s spe…"
- `ytc_Ugzq4gwaU…`: "When an AI agent commits a crime, or is directly implicated in a crime, how shou…"
- `ytc_Ugy3OarVW…`: "Get the AI creators to pay tax for using those systems every robot takes away 10…"
- `ytc_Ugj9Z8KXX…`: "This video misses one very big point: An AI can have multiple bodies. All of you…"
- `ytc_UgxVYIsqr…`: "22:31 *sigh* Yeah, this man is done. 24:00 Pro tip: If you see a potential tool…"
- `ytc_UgyT7-8qI…`: "right now AI's main effect on society is degrading human intelligence. it needs …"
- `rdc_latzxhz`: "Keep at it! I know it can be discouraging but I encourage you to try new things …"
- `rdc_nshfkuz`: "I don't trust these benchmark reports, they might have gamed a few or something …"
Comment
@ronald3836 that is a very balanced and level-headed way to handle this.
In fact, it's probably the best possible way to handle it because then the real issues will come to light in the courthouse.
The driver could sue Tesla and say that the autopilot should have continued to engage the brakes even though the manual says that they won't.
And maybe he still is found liable for the death of the lady, but maybe it would also enact some type of regulation to begin or at least conversations. Some sort of guidance for how self-driving vehicles need to work with the people.
Just like in web development, you have to assume that the user is a complete moron or extremely nefarious. Because we get what we see here all the time. Abuse of the system, and then surprise that the system was abused
Platform: youtube · Topic: AI Harm Incident · Posted: 2025-08-15T18:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
{"id":"ytr_UgxRdt8eSjcTKNdWyyx4AaABAg.ALrAbJ4j4wVALrcj2AxL1f","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwb5k1eMv_82I-tl_l4AaABAg.ALrADne6NrfALrD7d3y5jO","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzJsY4YTX7KWUG4PNx4AaABAg.ALrA7VzY9KtALrDLHUDgp3","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytr_UgwNxMoDnIf_9Gz9ikp4AaABAg.ALrA613Kr6yALrDINOHug1","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwNxMoDnIf_9Gz9ikp4AaABAg.ALrA613Kr6yALrDk96GVum","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwNxMoDnIf_9Gz9ikp4AaABAg.ALrA613Kr6yALrDsvehpaD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugzdh57IzbgdPKs9dH14AaABAg.AN6ZNn-HbAnANHUZAInvPM","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugwo_TMX1Mr-_4_AlcR4AaABAg.ALGrKzsHBWbALNungaqZSL","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwG7kGHvtz0_-9I7qB4AaABAg.AL5WfVDLZnkAMFywwL8bkk","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwG7kGHvtz0_-9I7qB4AaABAg.AL5WfVDLZnkAMIJlaFSltL","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
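The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated, assuming the dimension vocabularies are exactly the values seen on this page (the real codebook may define more categories, and `parse_codes` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per coding dimension (assumed from the values visible above;
# the actual codebook may include additional categories).
VOCAB = {
    "responsibility": {"none", "user", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "mixed", "unclear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values are in-vocabulary."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every dimension must be present and hold an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in VOCAB.items()):
            valid.append(row)
    return valid

# Hypothetical one-row response, in the same shape as the array above.
raw = '[{"id":"ytr_x","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"fear"}]'
print(len(parse_codes(raw)))  # → 1
```

Filtering out-of-vocabulary rows rather than raising keeps a single malformed code from discarding the whole batch; rejected IDs could instead be queued for re-coding.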