Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> If the Wall Street Journal was serious about reporting, it would put tesla crashes in context. How does Tesla autopilot compare with other systems? How does it compare to human driving? Should we forgive an inattentive driver being reminded to keep their eyes on the road more than ten times who then winds up dying because the car crashes? Would we do the same for someone using regular cruise control? Obviously not. Each of these drivers had the opportunity to hit the brakes at any time, but WSJ doesn't want to ask why they didn't. There is a person behind the wheel for a reason.

Platform: youtube · Category: AI Harm Incident · Date: 2024-12-15T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzJ80oBIPeLD-kYx9Z4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgwSmUzziEtW1d7G_-x4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz2kBh5IoHdK_mp_oR4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzhyFMaTQlSTc9vxZl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyxVBOGzljk0jRHXwR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyvwWK7QO4EQ0Dob4d4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwKyG1ElLD8RAdWsyl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxzWJieAZDpBO2DAPJ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugxpyp4o3ZeKSDaFx6l4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwsNiAK-S6WzFDntdh4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]
```
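A raw response like the one above has to be parsed and validated before the codes can be trusted. The sketch below is a minimal, hypothetical validator, assuming the codebook consists only of the values visible in this page (e.g. `responsibility` ∈ {user, company, government, none}); the real codebook may define additional values, and the function name is illustrative, not part of any shown tooling.

```python
import json

# Allowed values per coding dimension, inferred only from the values
# visible in the response above -- the full codebook may be larger.
ALLOWED = {
    "responsibility": {"user", "company", "government", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Raises ValueError if an entry lacks an id or a dimension, or uses
    a value outside the inferred codebook.
    """
    coded = {}
    for entry in json.loads(raw):
        comment_id = entry.get("id")
        if not comment_id:
            raise ValueError(f"entry missing comment id: {entry!r}")
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim} value {value!r}")
            codes[dim] = value
        coded[comment_id] = codes
    return coded

# One entry taken verbatim from the raw response above.
raw = ('[{"id":"ytc_UgzJ80oBIPeLD-kYx9Z4AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"liability","emotion":"indifference"}]')
result = parse_coding_response(raw)
print(result["ytc_UgzJ80oBIPeLD-kYx9Z4AaABAg"]["policy"])  # liability
```

Rejecting out-of-codebook values loudly, rather than silently passing them through, surfaces the malformed or hallucinated codes an LLM occasionally emits before they reach the aggregate counts.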