Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "wouldn’t it also imply that AI could automate major parts of what lawyers ge…" (rdc_nm8q1g5)
- "AI doesn't bother me as long as they don't try to pass it off as real.…" (ytc_Ugy0Ii3vb…)
- "I'm cheering for the AI. Fuck artists. If I had a dollar for every so-called art…" (ytc_UgztLheAh…)
- "If only they could make a economy based on developments per gdp like the nazi s …" (ytc_UgwXBNKs3…)
- "You mentioned sci-fi, and human manipulation by AI, and I've got to say, I've be…" (ytc_Ugwp3_F1G…)
- "First time I hear Tim say something wrong . AI will control robots . DARPA have…" (ytc_UgxzTzFBZ…)
- "I am not an artist BUT I STILL HATE AI art I understand the work and dedication …" (ytc_Ugzw4Vv7L…)
- "This is ridiculous, fear based nonsense. Who is going to buy the stuff that keep…" (ytc_Ugwo4THV_…)
Comment (youtube · AI Harm Incident · 2018-04-14T14:2… · ♥ 2)

> The car just drives right into that woman...0 reaction. It's so sad to see that a new technology takes someones life.. even if the accident was her fault, tesla and even the goverment shouldn't let automated cars out on the public road like this, cousing death... Limit the speed for these vehicles, until they learn every little detail of the roads, if there's a little risk or something unusual is happening next to the car: Slow down, alert the driver! If the people want self driving cars make them 100%safe first . This tragedy could have been easily prevented.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxu48WTenHCgOgQdwR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwLqzF1X1NEeHPH_C54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzv81s398PkyR_p_Wh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyMJgjlj3ONO3x1qBp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"},
{"id":"ytc_Ugy3Jvbtv1IIMc_Gb4h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwEaj_OvSl6JkcqRht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwmsL2y5MiyftdfEfh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwxaGE7TUjtUpSAjVd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgypSz2b-trrwPj1Wt54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwpojYDdejAB2qyjP54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
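Before the coded dimensions reach a result table, a raw batch response like the one above has to be parsed and validated. A minimal sketch, assuming the allowed value sets can be inferred from the labels seen in this dump (the real codebook may define more categories, and `parse_coding_response` is an illustrative name, not the tool's actual API):

```python
import json

# Allowed values per coding dimension, inferred from the output shown above;
# the actual codebook used by the tool may differ.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "sadness", "resignation", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records with no comment ID
        # Keep the record only if every dimension carries an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = (
    '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"outrage"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
coded = parse_coding_response(raw)
print([r["id"] for r in coded])  # the record with an unknown value is dropped
```

Validating against an explicit allow-list catches the common failure mode where the model invents a label outside the codebook, so such records can be flagged for re-coding rather than silently stored.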