Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I don't see how anyone finds AI art. Art is made from the imagination of a human…" (ytc_UgxOvBSuS…)
- ""AI" is just an algorithm, it doen't think. it doesn't understand. It cannot cat…" (ytc_UgypzA3Sd…)
- "Your summation in the epilogue threw me through a loop, all the sudden questioni…" (ytc_Ugww88gxC…)
- "I just got a midjourney ad and a ai assistant ad while watching this.Coincidance…" (ytc_Ugw76bVJ0…)
- "2:58 Well, if that is the case then then I’d say Sam is completely wrong and A.I…" (ytc_UgyFfeLoF…)
- "If you see a video of Sam Altman saying something responsible it is definitely a…" (ytc_Ugwac6Z2C…)
- "Let the college and universities die! We stay in a cave together. 😂 The fees are…" (ytc_Ugz1hy28P…)
- "Last time I looked robots were slower than humans and couldn't move like this. I…" (ytr_UgwkMZ6gh…)
Comment
> 1. what distance did the self driving car allow between the truck and vehicle?
> 2. what speed was one traveling.
> self driving cars will be programed to not tail gate.
> object falling from moving vehicles still have moment forward.
> with enough distance between truck and self driving vehicle no harm will be done.
youtube · AI Harm Incident · 2015-12-26T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugj_f2_hIfbFIngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjuSAOvpXKjoXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgjXkfuodsaTaXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgiH4bJgUd72t3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugis_iWcr_zaLHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugg1rrdyzbR2AXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghCPalsjYnrLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjHJF2WYdJEkngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiXr1C50oWCgXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh_wlHO5sE7gngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
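The raw response above is a JSON array with one record per comment ID, coding each comment on four dimensions. A minimal sketch of how such a batch might be parsed and validated before the coded values reach the results table; the per-dimension vocabularies here are inferred from the samples on this page and may be incomplete, and the function name is illustrative, not part of the actual pipeline:

```python
import json

# Allowed values per dimension, inferred from the coded samples above;
# the full codebook may contain additional categories (assumption).
VOCAB = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "mixed", "indifference", "approval", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that is not a dict with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension has an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in VOCAB.items()):
            valid.append(rec)
    return valid

# Hypothetical batch: one valid record, one with an out-of-vocabulary value.
raw = (
    '[{"id":"ytc_example","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"},'
    '{"id":"ytc_bad","responsibility":"robots",'
    '"reasoning":"unclear","policy":"none","emotion":"mixed"}]'
)
print([r["id"] for r in validate_batch(raw)])  # → ['ytc_example']
```

Filtering rather than raising on bad records lets a long coding run continue past the occasional malformed model output, which can then be re-queried by comment ID.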