Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "You don't need AI to copy a person's voice, there's a harmony box that can chang…" (ytc_UgwMBhxEx…)
- "I rather consider someone who draws stick figures as an artist before I consider…" (ytc_Ugy_cBm13…)
- "AI needs to be approached from a vantage point of raising children. Knowing that…" (ytc_Ugy_9seyK…)
- "Tell chatgpt to agree with you even if you are wrong and in case it disagrees wr…" (ytr_Ugynkkseo…)
- "@googleapocalypse1967 When you boil it down to a single tool it sounds like that…" (ytr_Ugwq9jbqa…)
- "Hearing chatgpts' voice saying that they'd kill a human over 5 sentient robots g…" (ytc_UgwYvtMlD…)
- "And on what I have heard the Velvet Sundown are nothing special there is much be…" (ytc_UgweV2wuW…)
- "Paused the video at the lame arguments the commenters make. Simply put, just bec…" (ytc_UgwCWy9A8…)
Comment

> I drive 7 miles to work. I had two cars pull directly in front of my while running red lights. That's a relatively mild day, really. The self-driving cars are far better and safer. I see a day when those of us who want to remain driving ourselves will pay a hefty premium for the license and insurance. I don't touch my phone while driving, neither should you! (I don't care if you are a cop.)

youtube · AI Harm Incident · 2026-04-24T18:3… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgxZHx3wgWGdJc9fRhF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzLIaH71x-_NACIA214AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzkAb0vDObPCNqL2Qd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxZrQxeoZrSr-GmNPx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyhbxjFZU8VG1Dnxbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzIKi5t3f8xqA2I5j14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwmdbRal4IEaKGrA9V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
 {"id":"ytc_Ugx5kV1l0HxUxD5K62B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
 {"id":"ytc_UgzOg1Pw2PDsmA1iFld4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugxtn63QfEPnPJgTyKN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
```
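The "look up by comment ID" step above amounts to parsing the raw LLM response (a JSON array of coding records) and selecting the record whose `id` matches. A minimal sketch, assuming only the field names visible in the sample response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `lookup_coding` helper is hypothetical, not the tool's actual code:

```python
import json

# Two records copied from the sample raw response above.
raw_response = """[
  {"id": "ytc_Ugx5kV1l0HxUxD5K62B4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgzIKi5t3f8xqA2I5j14AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for a comment ID, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup_coding(raw_response, "ytc_Ugx5kV1l0HxUxD5K62B4AaABAg")
print(record["responsibility"], record["policy"])  # user liability
```

A record retrieved this way carries exactly the four dimensions shown in the "Coding Result" table, so rendering that table is a direct field-by-field read of the matched record.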