Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "Your use cases are too obscure. The more commonly used the stack, the more accur…" (`rdc_mle6efo`)
- "What humans? AI could be used to make life better, but the world is so greedy th…" (`ytc_Ugz5zXf8O…`)
- "Napster got sued for more money then that existed at the time, open ai should ge…" (`ytc_UgyD0J4_t…`)
- "I’m all for AI writing new material. I haven’t watch a new movie or TV series i…" (`ytc_UgzMoUbzK…`)
- "I annoy the shit out of the ai until it starts screaming at me. I called an unna…" (`ytc_UgxEFef_O…`)
- "They've already decided what to do -roll up your sleeves -we don't need you or y…" (`ytc_UgzFVa_44…`)
- "There are so many movies why we shouldn‘t mess with AI the way we do and we stil…" (`ytc_UgyKQ9IK0…`)
- "What would be really funny, is if someone takes his ai art that he’s trying to c…" (`ytc_Ugz9Q-nK6…`)
Comment
@Zeno_334 If I were the family, I’d demand that Sam Altman and OpenAI’s corporate board should serve time in prison for a few years and do community service for this family for the rest of their lives, but I understand this is a civil case so we can’t go that far.
Given that it is a civil case, I’d say a good civil judgment would be $100 million for the family and $1 billion for suicide prevention.
Source: youtube | Topic: AI Harm Incident | Posted: 2025-09-01T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytr_UgwNZes6VMX6pxyj6VZ4AaABAg.9hg6fdORgmy9j6qmdWnqui", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzgqyQaCPRv6tVFS014AaABAg.9biCSTEHbh-9biDRuRrqkw", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgyRsijE4I_8dHhbytt4AaABAg.9NzwZhLk_cA9O-U5iE9r-j", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugw4JXq8wB4DK2wpRcZ4AaABAg.9N0f9ParKHd9NQCmQyd7Qg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugwwos-KWskfttsQxfR4AaABAg.AMWxLOai65YAMYP-6GgNrb", "responsibility": "distributed", "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_Ugz5V3G_jUPZBHbHn3V4AaABAg.AMTv4_EKitUAMdze7-Va5Q", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzuvLFKCq5DdTF4Zi14AaABAg.AMTrciv1YkMAMY5C5_6Rsj", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgyHVlkmaGtBkYB1dQd4AaABAg.AMSz0BKh2e_AMT5Zq3cRuS", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugxz2s5nGQRIzFH4CP14AaABAg.AMSbXKmshNQAMTEJGHKaa_", "responsibility": "distributed", "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_UgzA-GFURwf5AtMR9K14AaABAg.AMS_EBG7B2-AMSsKRBk_dO", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"}
]
```
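The lookup-by-comment-ID step above can be sketched as follows. This is a minimal illustration, assuming the raw model output is a JSON array shaped like the response shown (one record per `id`, with `responsibility`, `reasoning`, `policy`, and `emotion` fields); the `lookup_coding` helper name is hypothetical, not part of any real API.

```python
import json

# A raw LLM response in the shape shown above: a JSON array of coded
# records, one per comment ID. This single-record example is copied
# from the response for the highlighted comment.
RAW_RESPONSE = """[
  {"id": "ytr_UgzuvLFKCq5DdTF4Zi14AaABAg.AMTrciv1YkMAMY5C5_6Rsj",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]"""


def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM response and return the record for one comment ID.

    Returns None when the ID is absent from the response (hypothetical
    helper, illustrating the dashboard's look-up-by-comment-ID feature).
    """
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)


record = lookup_coding(
    RAW_RESPONSE,
    "ytr_UgzuvLFKCq5DdTF4Zi14AaABAg.AMTrciv1YkMAMY5C5_6Rsj",
)
print(record["responsibility"], record["emotion"])  # company outrage
```

The per-dimension values printed here match the Coding Result table for the comment shown above.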