Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
So, the driver floored the accelerator within 30 seconds of the crash, he wasn't looking at the road, he was rummaging around on the floor, and he ignored the warnings provided by the vehicle when you activate the automatic steering on your profile (and any time you take your hands off the wheel) saying that auto-steering must always be supervised by the driver, and the driver must be prepared to take over at any time. We have a Tesla and these warnings are clear; you must pay attention. These aren't just in the manual, they are displayed on the screen before you are allowed to activate the automatic driving mode. I find the argument that there wasn't sufficient warning strange given our experience.
youtube · AI Harm Incident · 2025-08-15T19:1… · ♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz05N2k2HstAfTObnx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxiqWl2KCLwiQ-LuHx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy-vVh8EgGFGv9hPXp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzBgr5InhwMYeiq5CF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyTq7JuerURsRbTaup4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzxC_zUpmJISuupmWd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzMGVyHLrEuoXa4Ffl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8fhtkzxtBvuNqo0x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz1OXVGATHuqOwrerV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgysO5ZhCSRvNMyFEuF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
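The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be parsed, validated, and indexed for lookup by comment ID — the field names match the response above, but the set of allowed category values is an assumption inferred from the examples on this page, not the full codebook:

```python
import json

# Allowed values per coding dimension, inferred from the examples shown
# on this page (assumption: the real codebook may define more categories).
CODEBOOK = {
    "responsibility": {"user", "company", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "indifference", "resignation", "mixed"},
}

def index_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID,
    dropping any record whose codes fall outside the codebook."""
    indexed = {}
    for rec in json.loads(raw):
        valid = all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items())
        if valid and "id" in rec:
            indexed[rec["id"]] = rec
    return indexed

# One real record from the response above, plus a deliberately invalid one
# (hypothetical ID) to show the validation path.
raw = '''[
  {"id": "ytc_UgzMGVyHLrEuoXa4Ffl4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_bad", "responsibility": "robot",
   "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]'''

coded = index_batch(raw)
print(coded["ytc_UgzMGVyHLrEuoXa4Ffl4AaABAg"]["emotion"])  # indifference
print("ytc_bad" in coded)  # False: "robot" is not a known responsibility code
```

Dropping malformed records rather than raising keeps a long coding run alive when the model occasionally emits an out-of-vocabulary label; a stricter pipeline might log or re-prompt those IDs instead.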