Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The A.I doesn't "create" nor has any skills whatsoever or understanding of how a…
ytc_Ugw9c2uqD…
I used that Smyrim AI mod for a week or two at one point, and despite them being…
ytc_UgwnXtGF9…
There are very serious issues that few with decision-making power have thought t…
ytc_UgwC5KmWZ…
Intern engineer said his older brother is a full stack UX software engineer. He …
ytc_UgyIl39mm…
fuck you, ai devs. know that you are a destructive force for this world and huma…
ytc_Ugz61dILx…
People leaving jobs are often those whose roles no longer align with a company’s…
ytr_UgyK2LNrM…
> but there is serious pushback now from generation z on these questions
Pus…
rdc_fen3u5y
if ChatGPT weren't designed to be "polite", they will be too straight forward an…
ytc_UgxB2Z2IL…
Comment
I hope this legal theory doesn't jump to other things... I mean.. it kind of already has, and society usually hates it.
The safety gas can that leaves 5% unable to come out. Playgrounds kids get bored with after age 5. Passengers can't pair their phone to a moving vehicle. Half of all products sold warn you that California thinks it could give you cancer.
The list goes on.
I love that Elon took a loss but if this scares car manufacturers, they will stop innovating.
I personally don't do 4+ hour road trips anymore because I find them miserable for the driver.
If we don't have self driving cars in 30 years, it's not because we couldn't do it, it'll be because we won't let it happen. 😢
youtube
AI Harm Incident
2025-08-16T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyTp4bS-FxEYBWa_2R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugy8lbz2-ZDkN6IZbG14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzPvbGYo29-rcR8b1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyR5B9KXHgr2nBlMMB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxZgShccTacBLeLF3x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugw5GL5gHFEFix5GnUR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxeZVWqD4x5xANjm8p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBrGwuGA7xpoPWgBB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzoCWixuWHaQ8HgI9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw6mxPBMtanB5Is5hJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
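The raw response above is a JSON array of coded records, one object per comment, each keyed by its comment ID. A minimal sketch of how such a batch can be parsed and indexed for lookup by comment ID; the two records are copied from the response above, and the variable names are illustrative, not part of any actual pipeline:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgyTp4bS-FxEYBWa_2R4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugy8lbz2-ZDkN6IZbG14AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Parse the batch and build an ID -> record index for point lookups.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up one comment's codes by its ID, as the panel above does.
codes = by_id["ytc_Ugy8lbz2-ZDkN6IZbG14AaABAg"]
print(codes["responsibility"], codes["policy"], codes["emotion"])
# → government liability fear
```

Indexing by ID makes the lookup O(1) per comment, which matters when the same batch of codes is inspected repeatedly from the UI.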