Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples

- "That ironically never happens for some reason. I guess the perk of being friend/…" (ytr_UgwJIAJN6…)
- "Calling yourself an artist while using ai is like printing a picture from the in…" (ytc_UgxFXHEMj…)
- "Funny how he used the same debating style against the AI he does against his fel…" (ytc_UgweOkqvE…)
- "It's only a matter of time when there will be robots with AI fighting in wars. T…" (ytc_Ugz1KhbsY…)
- "Reddit has always been an AI training tool. That's why the symbol is a robot...…" (rdc_kr8mzc8)
- "IN MANY GOVERNMENTS OF THE WORLD... THERE ARE ALREADY SOME OF THESE... REPLACING HUMANS... …" [translated from Spanish] (ytc_Ugxv4ynE8…)
- "Honestly, I dont care if ai becomes maim stream, cause all it means is that arti…" (ytc_UgxqFLPvN…)
- "5:27 😂😂😂 gimme a break thats just for attention grab / In no way a dump will have …" (ytc_UgzW3Zxh7…)
Comment
This video really bugs me because if something happened in front of the self-driving car i.e. large objects falling off of a truck, the first reaction of the car SHOULD be to brake until completely stopped. NOT swerve into traffic on either lane next to you. Swerving into traffic is an IMPULSIVE reaction made by humans. "Ethical dillema" my butt.
- Platform: youtube
- Topic: AI Harm Incident
- Posted: 2016-03-18T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
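Each coded record assigns one value per dimension. A minimal sketch of how such a record might be validated, assuming the category sets visible on this page (the dimension names come from the table above; the allowed values are only those observed here, not a complete codebook, and the `validate` helper is illustrative, not part of the tool):

```python
# Allowed values per coding dimension, inferred from the values
# that appear on this page. This is an assumption, not the full scheme.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above:
example = {
    "responsibility": "developer",
    "reasoning": "deontological",
    "policy": "liability",
    "emotion": "outrage",
}
print(validate(example))  # → []
```

A record missing a dimension, or using a value outside the observed sets, would be flagged with one problem string per failing dimension.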
Raw LLM Response
[
{"id":"ytc_UggLgMjOnAq3engCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghyqqDTlrLf9HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjyLWph_MtItXgCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugigy3nbNEhlSngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UghcuF6gJA-fpHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgjTLCUkXByJc3gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggLICqx-XT7aHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgiesN3Zk63rRHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgjRuKELFIGsrXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjtG81Si3yyjHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
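The raw response is a JSON array with one object per coded comment, keyed by `id`. A sketch of how such a payload can be parsed and indexed for comment-ID lookup (variable names are illustrative, not the tool's API; the two records are copied from the response above):

```python
import json

# Two records excerpted verbatim from the raw LLM response shown above.
raw = '''
[
  {"id":"ytc_UggLgMjOnAq3engCoAEC","responsibility":"user",
   "reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiesN3Zk63rRHgCoAEC","responsibility":"developer",
   "reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
'''

records = json.loads(raw)           # list of dicts, one per coded comment
by_id = {r["id"]: r for r in records}  # index for look-up by comment ID

print(by_id["ytc_UgiesN3Zk63rRHgCoAEC"]["emotion"])  # → outrage
```

Indexing by `id` is what makes the "look up by comment ID" view possible: one `dict` lookup retrieves the full coding for any comment in the batch.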