Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples
- ytc_Ugwgta0Gd… — "Here is my take on AI, it is great for dealing with the backbreaking gruntwork, …"
- ytc_UgxAhlLa0… — "I predict that 'the readers' as AI calls humans, others than the chosen few, wil…"
- ytr_UgyeONOwO… — "Hi there! In the video, Sophia was asked a thought-provoking question that made …"
- rdc_enj3hnz — ">The San Francisco Board of Supervisors on Tuesday enacted the first ban by a…"
- ytc_UgyOCcuYR… — "I disagree with the AI not taking jobs. AI already controls, monitors, and micr…"
- ytc_UgxOTi8Ll… — "Aren't the argument corporations are making for AI...communism? As a socialist, …"
- ytc_UgzH4JuVi… — "I love how people think "who's going to buy stuff" and never 'we don't need one …"
- ytr_UgwxUUiVt… — "Bernie lays it out in this video, smaller work week, yet same pay. We are alread…"
Comment
> Surely the self driving car should not be so close to the truck in the first place, therefore it would have a greater distance between the danger to allow more time for deceleration. Also its not as thought the boxes shown in the video would instantly stop either. If they were heavy enough to cause serious damage to the car then they would have more momentum to get rid of and would decelerate slower over a longer time.
Source: youtube · Topic: AI Harm Incident · Posted: 2015-12-15T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugh8rhSAIlTrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj8lU9CWdFWf3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghFWNaMvDiVGngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgggdoqiWWgg1HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggYKBs14QZPoHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi9xnzyNGEqq3gCoAEC","responsibility":"society","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugim4SKNBlRtfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjXjm0R3slUzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghxbQR1FcrERngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgipNWetGSuz7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
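The lookup-by-ID step above can be sketched in a few lines: parse the raw LLM response as a JSON array and index the coded rows by comment ID. This is a minimal illustration, not the tool's actual implementation; the two entries below are copied from the response shown above, and the field names match the Coding Result table.

```python
import json

# Two rows copied from the raw LLM response above (illustrative subset).
raw_response = """
[
  {"id": "ytc_UggYKBs14QZPoHgCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugi9xnzyNGEqq3gCoAEC", "responsibility": "society",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UggYKBs14QZPoHgCoAEC"]
print(coding["responsibility"])  # → ai_itself
print(coding["emotion"])         # → indifference
```

A dict keyed on the comment ID mirrors how the page resolves an ID like `ytc_UggYKBs14QZPoHgCoAEC` to the coded dimensions shown in the Coding Result table.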