Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I am taking those AM ai chats to the grave and you gotta resurrect me to even ha…" (ytc_Ugxx_hVZO…)
- "@MASKEDB 1. Cute attempt at gaslighting. I know I did not like either of those c…" (ytr_UgyF1Ceng…)
- "This guy is nuts if he thinks AI will replace 99% of jobs by 2030. And especiall…" (ytc_Ugxlo7d07…)
- "The danger from AI isn't from AI. The danger from AI is the human manipulation o…" (ytc_UgziDw8Lw…)
- ""ChatGPT you're beginning to sound like Jordan Peterson with your definition of …" (ytc_Ugy_IsNcY…)
- "As a cyclist who has been hit by cars on three separate occasions, I say bring o…" (ytc_Ugz4hMq_q…)
- "So all your videos are AI generated? Slop about slop? That's clearly an AI voice…" (ytc_Ugx9m0PkF…)
- ""AI is just a tool-" Not if it makes the entire piece for you. A hammer doesn't …" (ytc_UgzgQqKYL…)
Comment

> The solution is easy... you program in ALL the possible decisions that the car can make in a situation like that and if an accident happens you make it choose *randomly* so that there's no "moral blame" to the programmers. The number of lives that self-driving cars will save *FAR* outweighs these nitpicky moral dilemmas. The fact that you're using a safer device overall that reduces the chances of hurting people justifies the rare crashes that will happen.

youtube · AI Harm Incident · 2015-12-16T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugh8rhSAIlTrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj8lU9CWdFWf3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghFWNaMvDiVGngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgggdoqiWWgg1HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggYKBs14QZPoHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi9xnzyNGEqq3gCoAEC","responsibility":"society","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugim4SKNBlRtfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjXjm0R3slUzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghxbQR1FcrERngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgipNWetGSuz7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
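A response like the one above is easy to validate and index programmatically. The sketch below is a minimal example, assuming the four dimensions and the value sets visible in this page (the full codebook may allow more values); the function name and schema dictionary are hypothetical, not part of the tool shown here.

```python
import json

# Allowed values per coding dimension, inferred only from the responses
# shown above -- an assumption, not the authoritative codebook.
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself", "society"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "industry_self", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate_records(raw: str) -> dict:
    """Parse a raw LLM response and index the valid records by comment ID."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed

# One record from the response above, used as a self-contained example.
raw = ('[{"id":"ytc_UghFWNaMvDiVGngCoAEC","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
by_id = validate_records(raw)
print(by_id["ytc_UghFWNaMvDiVGngCoAEC"]["emotion"])  # approval
```

Indexing by ID is what makes a lookup like the "Coding Result" view above cheap: one parse per response, then constant-time retrieval per comment.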