Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
would having a human that's mind is willing to shear his conchesnes with your ai…
ytc_UgwMPoiKJ…
The goal is a post scarcity society where people’s worth isn’t determined by lab…
ytc_UgxxuFu_A…
Why? Automated passenger vehicles are by far safer on the road than human driver…
ytr_UgxfXmKe2…
Is that book legit I have saw others post about it but said AI was just posting …
ytr_Ugzg4TENP…
I asked it to reply anyway:
"It is intelligent—just in a weirder way than peop…
ytr_Ugy7AdZ5Q…
18:07 You do drugs. Lots of drugs.
If humanity makes it past the AI filter aliv…
ytc_Ugzzt9_Yw…
"The robot arm clamped onto him crushing his chest and face"
Yeah Zack, my day …
ytc_UgzjTCb7l…
I think it was the drivers fault, to everyone who believe its the ,,bad modern t…
ytc_UgzKz44xO…
Comment
I bought my Tesla in 2020. I can’t say how things were in 2019, but by the time I had mine there is no way that I would have ever had the thought that the driver assist features were safe enough for me to stop paying attention to the road. It was a couple years before the full self driving mode was activated in my car.
The description of the case makes it sound like he kept his foot on the accelerator. At all times I have owned my car, my foot on the accelerator overrides all the attempts at safety the car may try, I have essentially taken control. (I have not tested what happens if I try to drive directly into a parked vehicle, so my experience is lacking a little).
So, it’s possible Tesla was negligent at the time, but I can’t see this happening to me in my car now without me being majorly at fault. The current version of fully automated driving is pretty awesome.
Source: youtube · Topic: AI Harm Incident · Posted: 2025-08-15T19:1… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz05N2k2HstAfTObnx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxiqWl2KCLwiQ-LuHx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy-vVh8EgGFGv9hPXp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBgr5InhwMYeiq5CF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTq7JuerURsRbTaup4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxC_zUpmJISuupmWd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMGVyHLrEuoXa4Ffl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8fhtkzxtBvuNqo0x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1OXVGATHuqOwrerV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgysO5ZhCSRvNMyFEuF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
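A batch response like the one above is only usable if every record carries a valid value for each coding dimension. The following is a minimal validation sketch; the allowed label sets are inferred from the values visible in this dump (the actual codebook may define more), and the `validate_batch` helper name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the labels seen in this dump.
# The real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "mixed", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and check every coded dimension."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ or ytr_.
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError("unexpected comment id: %r" % rec.get("id"))
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: bad %s value %r" % (rec["id"], dim, rec.get(dim)))
    return records

# Example: one record from the response above passes validation.
raw = ('[{"id":"ytc_Ugy-vVh8EgGFGv9hPXp4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
records = validate_batch(raw)
print(len(records))  # → 1
```

Rejecting malformed records at parse time keeps coding errors (hallucinated labels, dropped fields) out of the downstream analysis instead of surfacing later as impossible category counts.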