Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I don't care for the interviewer I would have done the robot dance with Sophia. …" (ytc_UgwWvlaH0…)
- "I find it absurd that people would feel it necessary to threaten you over this. …" (ytc_Ugwo5c7KG…)
- "Remember!...these robots are 'programmed!'... what they say!...is likely the sam…" (ytc_UgwCeZykj…)
- "I really don't think that self-driving cars are the best cars to have because it…" (ytc_Ugxr6E9-m…)
- "Ai art company’s scrape real art of the internet without permission and feed it …" (ytr_UgzIR8zsh…)
- "Ill just ask ai to build a new Microsoft and only charge Microsoft to pay for ac…" (rdc_oh2r8en)
- "The moment I saw it I was like “Nah that’s ai” Then I proceeded to laugh at all …" (ytc_Ugyd-E__W…)
- "Some people think the beast in revelations will be AI. And the neuralink implant…" (ytc_Ugwk3IJR4…)
Comment

> Autopilot doesn’t recognize stop signs. The stop signs are the drivers responsibilities. Now if that was full self driving then it would be partially the cars fault but think it’s purely the drivers fault. It was being misused
> But all that said that many strikes in that short of time he should have been locked out of it for a long while.

youtube · AI Harm Incident · 2025-08-15T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxrTk6SweySWtZKJqZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyUUtDJ5nqYcA_M6694AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgwuNsWEGvAj4pFHnqB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyEGcSuVgQcdSzwwh14AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzbBYRrtKs3vrVkScB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxNI0uUcscS2OfpMml4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzJPSY0BgLAg3uK7Jl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwbjPEQvd7r0hPUJb94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgynbAv2uA1nwHGHZ0h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxf3iuQuMV4Or-DDuF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
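The lookup-by-comment-ID flow described above can be sketched in a few lines: parse the raw response (a JSON array of per-comment codings), index it by `id`, and check each coding against the codebook. Note this is a minimal sketch: the function names are hypothetical, and the `ALLOWED` sets are inferred only from the values visible in this dump; the real codebook may define more categories.

```python
import json

# A one-element stand-in for the raw LLM response shown above
# (same schema: id plus four coding dimensions).
raw_response = """[
  {"id": "ytc_UgwuNsWEGvAj4pFHnqB4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"}
]"""

# Allowed values per dimension, inferred from this dump (assumption:
# the full codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "user", "company"},
    "reasoning": {"unclear", "deontological", "contractualist",
                  "virtue", "consequentialist"},
    "policy": {"unclear", "regulate", "industry_self"},
    "emotion": {"approval", "disapproval", "indifference",
                "outrage", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse the raw response and index each coding by its comment ID."""
    return {coding["id"]: coding for coding in json.loads(raw)}

def invalid_dimensions(coding: dict) -> list:
    """Return the dimensions whose value falls outside the assumed codebook."""
    return [dim for dim, values in ALLOWED.items()
            if coding.get(dim) not in values]

by_id = index_codings(raw_response)
coding = by_id["ytc_UgwuNsWEGvAj4pFHnqB4AaABAg"]
print(coding["responsibility"])   # -> user
print(invalid_dimensions(coding)) # -> []
```

Indexing by ID once, rather than scanning the array per lookup, is what makes the "look up by comment ID" feature cheap even for large batches.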