Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- `ytc_UgxNe5s1M…`: "In my opinion, Shad is a rather mediocre artist who tries to get away with his v…"
- `rdc_ohw57um`: "I don't think it's you. I bet you anything this company doesn't have the money t…"
- `ytc_Ugx_Pnjcg…`: "Some leave because they think LLMs are a dead end and some because they think it…"
- `ytc_Ugx-jsCX6…`: ">> Here's the summary for those who don't want to go through the whole video T…"
- `ytc_UgwEP2yGq…`: "I've seen classmates rely too much on AI, but Olovka's note-taking features help…"
- `ytc_UgxRcEmO2…`: "For AI to 'get rid of people' would be similar to a man becoming dictator by kil…"
- `ytc_Ugwc49nb-…`: "People on here thinking that this is real are stupid. AI manipulated. People …"
- `ytr_UgyVWLk3G…`: "Lol I was looking for a Cyberpunk mention. Would've been hilarious if the Waym…"
Comment
While self driving car should have stopped, safety driver should have been paying attention and stopped, for me the blame is 100% with the pedestrian. Jaywalking and not looking for oncoming traffic is incredibly stupid. I would be concerned about any attempt to make drivers "more responsible" in Jaywalking scenario, as pedestrians will have no incentive to pay attention crossing the street. Even if you are crossing legally and you don't look for oncoming traffic, it is still pedestrians fault. You might be right, but you'd be dead right. Don't mess with oncoming traffic, pay attention!!!!!!!
Source: youtube · Topic: AI Harm Incident · Posted: 2018-03-23T12:5… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxy01VX_8QwXy9_57V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzP2TOIEQNrkwHotDp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwjt5cv4iPRLp6BKrF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgyUUkGKDz2ID-KRa1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzSzbckQWJHdkyLy0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwkBaLtQi5J43dOVjF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugz0hN4KTF0haVONTDJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwvRS-7NO2QZfGHuJ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz2x7umeeGIBkOO7Sp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwQLYrOrlprqGY7tD94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
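The raw response is a JSON array of per-comment codes across the four dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and validated before it reaches the coding-result store; the `CODEBOOK` vocabularies below are inferred from the values visible on this page and may be incomplete relative to the real codebook:

```python
import json

# Allowed values per dimension (assumption: inferred from the responses on
# this page; the actual codebook may define additional categories).
CODEBOOK = {
    "responsibility": {"user", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that have an id and
    whose every dimension value is in the codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # cannot attach a code to a comment without its ID
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

# Hypothetical example: the second row uses an off-codebook reasoning value
# ("virtue") and is dropped.
raw = (
    '[{"id":"ytc_x","responsibility":"user","reasoning":"deontological",'
    '"policy":"none","emotion":"outrage"},'
    '{"id":"ytc_y","responsibility":"user","reasoning":"virtue",'
    '"policy":"none","emotion":"outrage"}]'
)
print([r["id"] for r in validate_batch(raw)])  # ['ytc_x']
```

Rows that fail validation would typically be queued for re-coding rather than silently discarded, so the sample counts on this page stay consistent with the underlying comment set.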