Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugw6BEIDU…`: "I talk to AI. Most people do, these days. Not in love with one though. I find th…"
- `ytc_UgzGsLptN…`: "Now I know what I want for Xmas! I wonder how much she costs. She looks so real.…"
- `ytc_UgyBpVJIS…`: "Anthropomorphic robots are bs. However AI used to exploit people without them re…"
- `ytc_UgzOqhL7v…`: "We’re limited by our physical ability to communicate but some folks are already …"
- `ytc_UgzM4wY0L…`: "it will replace developers, but by then it will also replace pretty much all job…"
- `ytr_UgzTQhXiJ…`: "@flippidlyflipi have no clue who Sam Altman is, but no????? What relevance does…"
- `ytc_UgzhlDX1c…`: "For more Refer to this - https://en.wikipedia.org/wiki/P(doom) The opinion of…"
- `ytc_Ugz2nJv9-…`: "I know this person isn't AI becuase there's random bannanas manifesting in the b…"
Comment
It's not going to end tesla. This was a case from 2019. Current iterations make it clear Autopilot is not full self driving and will not stop at stop signs and require constant attention to the road. Though I do think that FSD should be engaged in dangerous situations even if you don't pay for it. After all, if the car has the ability to save your life and doesn't because you aren't paying a subscription, that is a huge flaw.
youtube · AI Harm Incident · 2025-08-15T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz05N2k2HstAfTObnx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxiqWl2KCLwiQ-LuHx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy-vVh8EgGFGv9hPXp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBgr5InhwMYeiq5CF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTq7JuerURsRbTaup4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxC_zUpmJISuupmWd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMGVyHLrEuoXa4Ffl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8fhtkzxtBvuNqo0x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1OXVGATHuqOwrerV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgysO5ZhCSRvNMyFEuF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
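A raw response like the one above has to be parsed and checked before the codings are stored, since the model can return malformed JSON or out-of-vocabulary labels. The sketch below is a minimal validator; the per-dimension vocabularies are an assumption inferred only from the values visible in this sample output, not a confirmed coding scheme.

```python
import json

# Allowed values per dimension: ASSUMED from the labels observed in the
# sample response above; the real codebook may include more categories.
SCHEMA = {
    "responsibility": {"none", "distributed", "user", "company"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "mixed", "indifference", "resignation"},
}


def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows that are missing an id
    or carry an out-of-vocabulary value for any dimension."""
    rows = json.loads(raw)  # raises on malformed JSON
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing id: {row}")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={value!r}")
    return rows


# Example with a single well-formed row (hypothetical id).
raw = (
    '[{"id":"ytc_X","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]'
)
print(len(validate_codings(raw)))  # 1
```

Rejecting a whole batch on the first bad row keeps the stored codings clean; a more forgiving variant could instead skip invalid rows and log them for re-coding.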