Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
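The ID lookup above can be sketched as a simple index over the coded records. This is a minimal sketch, assuming the coded comments arrive as a JSON array shaped like the raw LLM response shown below; the variable names and the inline sample record are illustrative, not the tool's actual storage.

```python
import json

# Minimal sketch: one coded record in the same shape as the raw LLM
# response below (id plus the four coding dimensions).
raw = """[
  {"id": "ytc_Ugj_f2_hIfbFIngCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

records = json.loads(raw)

# Index by comment ID so one coded comment can be pulled up directly.
by_id = {r["id"]: r for r in records}

print(by_id["ytc_Ugj_f2_hIfbFIngCoAEC"]["emotion"])  # → outrage
```

With the index in place, inspecting any coded comment is a single dictionary lookup by its ID.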
Random samples — click to inspect

- "I find it funny and also sad that Peter Joseph talked about technological unempl…" (ytc_UgxUa_0rv…)
- "I agree with Hayumi Azaki, it just proves how lazy these A.I companies are. How …" (ytc_Ugw5G3YGH…)
- "This guy just like, yeah i knew i was inventing something more dangerous than th…" (ytc_Ugz8WV1Hw…)
- "My favourite insight is how ChatGPT once told me that soil quality has a direct …" (rdc_m2fvla7)
- "As an artist, I have no problem with AI as long as you label it as such. Everyon…" (ytc_Ugwl-hF1D…)
- "This video is in one way hyperoptimistic - on the development and advancement of…" (ytc_UgzpilGjq…)
- "@wlatol6512 it kinda is, I have experience with ai and the negative and positiv…" (ytr_UgwO6SFiY…)
- "My heart is already wide open for them. I don’t wanna date them but u can feel t…" (ytc_Ugx8GZI8I…)
Comment
Hit the safest vehicle. If you are in a self driving car if it crashes into another car both cars are most likely safe. If human safety on either side can not be guaranteed then hit the object in front of you.
youtube · AI Harm Incident · 2015-12-27T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugj_f2_hIfbFIngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjuSAOvpXKjoXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgjXkfuodsaTaXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgiH4bJgUd72t3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugis_iWcr_zaLHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugg1rrdyzbR2AXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghCPalsjYnrLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjHJF2WYdJEkngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiXr1C50oWCgXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh_wlHO5sE7gngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
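A raw batch response like the one above is only useful if every row carries valid codes. The following is a hedged sketch of a sanity check; the allowed values are assumptions inferred from the codes visible on this page, not a complete codebook, and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed code values per dimension. These sets are ASSUMPTIONS inferred
# from the codes visible in this page, not the project's full codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "mixed", "indifference", "approval", "resignation"},
}

def validate_batch(raw_text: str) -> list:
    """Return a list of problems found in a raw LLM batch response."""
    try:
        rows = json.loads(raw_text)
    except json.JSONDecodeError as e:
        return ["not valid JSON: {}".format(e)]
    problems = []
    for row in rows:
        rid = row.get("id", "<missing id>")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                problems.append(
                    "{}: bad {!r} value {!r}".format(rid, dim, row.get(dim))
                )
    return problems
```

An empty return means every row parsed and every dimension carried a recognized code; anything else names the offending comment ID and field.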