Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> We get so caught up making these highly improbable hypotheticals that we forget that human drivers _would also struggle_ in the exact situation, not just a computer. Therefore, this can't be used as an argument against self-driving cars in particular. Although this video partially recognizes that towards the end, more emphasis should be placed on reminding the viewer that these cars statistically kill fewer people. Besides, the hidden option here would be to apply the brakes. If someone tailgates you, that's their fault

- Source: youtube
- Topic: AI Harm Incident
- Posted: 2022-06-08T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxi9edyRH6MBe-gmlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyHzAmobw_w11Mnb-d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxyiKB7SdSvnhxtM-N4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwDx5EZ27hVBDO-G3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy52cuaNxv0pCVCSnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzbIlfqTdOKTNc9ph14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxh5MKYmtMPkv_l1S94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlSt_4SUBX-NmdT0p4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugx2arFYsbTn-QoyeUF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
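Since each raw LLM response is a JSON array of per-comment codes, looking up one coded comment by ID reduces to parsing the array and indexing it. The sketch below is a minimal, hypothetical helper (the function name `index_codes` and the inline sample record are illustrative; only the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response above):

```python
import json

def index_codes(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records,
    each with an "id" field) and map each comment ID to its codes."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# One record copied from the raw response shown above.
raw = '''[
  {"id": "ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]'''

codes = index_codes(raw)
rec = codes["ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg"]
print(rec["reasoning"])  # consequentialist
```

In practice the index would be built once per batch response and reused across lookups, which is what makes per-ID inspection of a coded comment cheap.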