Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or browse the random samples below and click one to inspect it.

Random samples:

- "When comparing AI with humans and by that setting the gold standard of "Intellig…" (ytc_Ugz0HuYIr…)
- "I don’t even like copilot interference in word and excel so I disabled that anno…" (ytc_UgzvmGDgq…)
- "The OneyNG AI sounds like Adam and the Squidward AI sounds more like Squilliam F…" (ytc_Ugxa6tRYU…)
- "@clementinelives it was generally just AI, and I was angry at tiktok for promoti…" (ytr_Ugw66GtE5…)
- "There is a video out there that says what ai needs to have, to be coscious 1. Ha…" (ytc_UgwZUzQle…)
- "I spend part of the year in Hawaii and part on the mainland. I fly often. I was …" (ytc_UgwZk3ubE…)
- "Attending here, imagine how well med students would do on step exams with open i…" (rdc_jkq964i)
- "Will the Data centers catch our corrupt government Will the ai be trained to be …" (ytc_UgwnaWvrj…)
Comment
I have one problem with this scenario that I haven't noticed in the comments yet. The second scenario with two bikers, 2:04, they never examine the obvious third option that was in the first scenario. Let the car attempt to slow down as it hits the falling object. Instead the narrator chooses to ignore this and suggest to the audience that our only choices are to kill others by hitting the person with the helmet or the one without to try to make a point.
Sure, this doesn't completely invalidate the scenario. But it kinda betrays the narrator's own bias and determination (and suddenly insidious goal) to make us leery of self driving cars.
Source: youtube | AI Harm Incident | 2015-12-09T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UggTAra7ykO18HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiiIRzPV-PDJngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UghNHFfbScHAI3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Uggs6xSxQV1idHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh555atHjwB23gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjGZiL-RQWZh3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj_T2kb-3J5iHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgglL4SDgYq70ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UghQrXYx4XEWV3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Uggozw99vhiuyngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
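The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch might be parsed and indexed for the lookup-by-ID view, assuming the allowed value sets inferred from the sample output (the real codebook may define additional categories):

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (an assumption, not an authoritative codebook).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "fear", "indifference", "outrage", "mixed"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into a dict keyed by comment ID,
    dropping any row with a missing ID or an out-of-vocabulary value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        dims = {k: row.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[cid] = dims
    return coded


# Usage with a hypothetical one-row batch:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
by_id = parse_batch(raw)
print(by_id["ytc_example"]["emotion"])  # mixed
```

Rows that fail validation are silently skipped here; a production pipeline would more likely log them for re-coding rather than discard them.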