Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
| Comment preview | ID |
|---|---|
| Even the robot doesn’t place his hands on the trigger until he’s ready to fire. … | ytc_Ugzz8enep… |
| It's amazing, the first advertisement came up with AI chat gtp. It's actually di… | ytc_UgymVqAeM… |
| I mean if history is any meaurement of human behaviour, i dont think the AI gun… | ytc_UgxSY4WVI… |
| The statement at the end from the spokesperson reads just like it was ChatGPT ge… | ytc_UgzGqHMzA… |
| All bullshit ai can be stopped or slowed down humans made it , it taking over/ta… | ytc_UgwxHA5NP… |
| The only jobs that are going to be relatively safe for the next 10 years are tra… | ytc_UgzO-wXnu… |
| I don't think it is that simple. To me it is analogous to a young artist learnin… | ytr_UgyL8apoP… |
| In general, I have nothing against machines taking over repetitive and/or danger… | ytc_Ugz4VtPvy… |
Comment

> I think this accident is good for self-driving techniques. The video shows that human drivers also can't avoid that, because it's too dark. I think the car may need more cameras which are sensitive to the light in different wave length, then training those data in the deep learning algorithm. The government should also improve the power of the streetlight.

youtube · AI Harm Incident · 2018-03-23T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyXwumYFEN8fJ3ENt94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz7rctvLE5Sj191MXh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyTLEzI6_gVBXJTf7d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgybSOKV_omEYjzFyYB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyptGxQ6brB-gxOMoN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxpmM9Fk5rQzS-xnAx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxQuN1J4VFRSGgoWoN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugx0LfjoVulLMEfdRqt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx1-RYHmA-suCLWISJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXtEywrdcUcPbSJz54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
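As an illustrative sketch (the actual coding pipeline is not shown on this page), a raw response in the format above could be parsed and validated before being stored. The field names below match the JSON; the allowed value sets and the function name `parse_coding_response` are assumptions inferred from the values visible on this page, and the real codebook may include more categories:

```python
import json

# Allowed values per coding dimension -- inferred from the values visible
# on this page; assumption: the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "user", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs on this page start with "ytc_" (comments) or "ytr_" (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Drop records with a missing dimension or an out-of-codebook value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgyTLEzI6_gVBXJTf7d4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(len(parse_coding_response(raw)))  # 1 valid record
```

Validating against an explicit allow-list like this catches the common failure mode where the model invents a label outside the codebook, so malformed records can be flagged for re-coding instead of silently entering the dataset.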