Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "They never said they hate ai as a whole, they said they are preventing ai from t…" (ytr_UgyD0PZ5g…)
- "It’s bad enough that we are giving AI a female persona, including an imitation b…" (ytc_Ugy-qeTwT…)
- "Got an ad about how you can’t spell “Anything” without “ai” / The irony is steel …" (ytc_UgyZrB6gi…)
- "People likes the easy way, so if ai provides it people wlll stop learnimg and st…" (ytc_Ugx-JBq7d…)
- "I feel sorry for this family, but only children that are severely depressed comm…" (ytc_UgzdZ2ihY…)
- "@NotTheEnd7766 AI research is killing people RIGHT NOW. It's dirtying their ai…" (ytr_UgwbER0RF…)
- "1- he is not an artist at all / 2- nobody destroyed him, ppl just gave him a lot …" (ytc_UgwsxFK6r…)
- "@ right, I’m not saying that it’s a good thing either, just poking at those “AI …" (ytr_UgzkwSlo0…)
Comment
The self-driving car should always stop before hitting someone else.
If the car can't stop in time, it was poorly programmed and should be reprogrammed to increase following distance. If a jerk cuts off the self-driving car and spills something and collision is unavoidable, hit the irresponsible driver who instigated the accident by producing a problem that could not be resolved without incident. It's fair and unbiased.
Source: youtube · Category: AI Harm Incident · Posted: 2015-12-28T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugj_f2_hIfbFIngCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgjuSAOvpXKjoXgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgjXkfuodsaTaXgCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgiH4bJgUd72t3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugis_iWcr_zaLHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugg1rrdyzbR2AXgCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UghCPalsjYnrLHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgjHJF2WYdJEkngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiXr1C50oWCgXgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugh_wlHO5sE7gngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
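A minimal sketch of how a raw response batch like the one above might be parsed and validated before the codes are stored. The allowed values per dimension are inferred from the examples on this page and may be incomplete; the function name and schema constant are illustrative, not part of the actual pipeline.

```python
import json

# Hypothetical coding scheme, inferred from the values visible above.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "mixed", "indifference", "approval", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict, carries an "id", and every
    coding dimension holds one of the values listed in SCHEMA.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugj_f2_hIfbFIngCoAEC","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(validate_batch(raw)[0]["emotion"])  # outrage
```

Dropping malformed records rather than raising keeps a single bad line in a ten-item batch from discarding the other nine codes.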