Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Honestly I very much doubt we will ever come to a point were AI will do every jo…" (ytc_Ugy23V5bb…)
- "If the people instructing the ai don't know how to properly develop software (ak…" (ytc_Ugz0-uM88…)
- "I really don't understand where the problem is when AI takes all the shitty jobs…" (ytc_Ugy3458wm…)
- "I took a two year class in photography and my photos are still not professional …" (ytr_UgzOx3Ub9…)
- "I remember when this was first being talked about being suspicious, but not real…" (ytc_UgxXLhTfo…)
- "I feel like the AI artists there focus too much on least time spent producing th…" (ytc_UgyQRcrCH…)
- "Stupid AI is the most dangerous, AGI is okay, but it can't create, innovate, dis…" (ytc_Ugw3HQzvK…)
- "I miss the old days before computers, cell phones and now we have AI robots the …" (ytc_UgwGaagFf…)
Comment
> +Joe Cool Self-driving cars don't need to be perfect, they just need to be better than humans. Reaction times on the order of microseconds > taking a full second to decide what is ethical or doing nothing because the driver wasn't alert. Or swerving and not seeing what is there because a human doesn't have 360 degrees vision.

youtube · AI Harm Incident · 2015-12-10T05:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugi-ra97OFAYf3gCoAEC.8A2x-6Y9iR39_jA1MigRw-","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UggcjG7wPcXM-ngCoAEC.87ksLSYwmAW87lRqc_5nOt","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgjbGUooE19fn3gCoAEC.87ae9OwYcWP87aeSIvS21k","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgibjtNUDEehjngCoAEC.87_AnhDBK0Q87_DW-CiC2P","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Uggm5BdzwhyWVngCoAEC.87ZJkl4btdC87ZRqdCDAIY","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugj-Xh3Fxwz1RXgCoAEC.87YwkNlHcCU87Zv0jNj-Ag","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugj-Xh3Fxwz1RXgCoAEC.87YwkNlHcCU87Zx3NNhY6U","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_Ugj-Xh3Fxwz1RXgCoAEC.87YwkNlHcCU87ZxquYYaiQ","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UggP-iFt14eaaHgCoAEC.87YkvCWMel-87Zi3ixQABR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UghURWjOQRHtGHgCoAEC.87XLJSTRT9v87clu7Fezdn","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
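The batch response above is a flat JSON array, one object per coded comment, with four coding dimensions plus the comment ID. A minimal validation sketch in Python (the field names come from the records shown; anything else, such as treating missing keys as an error, is an assumption about how the pipeline should behave):

```python
import json
from collections import Counter

# Fields every coded record carries in the response above; the pipeline is
# assumed to reject records that omit any of them.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check each record's shape."""
    records = json.loads(raw)
    for i, rec in enumerate(records):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {i} missing fields: {sorted(missing)}")
    return records

# Two records copied verbatim from the response above.
raw = '''[
  {"id":"ytr_Ugj-Xh3Fxwz1RXgCoAEC.87YwkNlHcCU87Zv0jNj-Ag","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgibjtNUDEehjngCoAEC.87_AnhDBK0Q87_DW-CiC2P","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

records = validate_batch(raw)
print(Counter(r["emotion"] for r in records))  # Counter({'approval': 1, 'fear': 1})
```

A check like this is useful before loading codes into the inspector, since a batch-mode LLM occasionally drops a field or returns a record whose `id` does not match any comment.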