Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If these type of self driving cars end up being the majority of cars on the roads, and they are all programmed the same way and are told to avoid crashing into a certain type of vehicle, then this will encourage the public to buy the type of vehicle that these programs try to avoid crashing into. And these vehicles will be the ones that are inherently less likely to survive a crash, which is precisely why the program tries avoid crashing into them. Therefore, by introducing these self driving cars, we are encouraging people to buy cars that are inherently more dangerous to drive.
Source: youtube · Category: AI Harm Incident · Posted: 2015-12-08T16:4… · ♥ 41
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UggmIyJ8SloWNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghTisOhXvg2MXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggJ7uf4xwzHrHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughd4nDqmE0otngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghTs3eIZEp4CXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiT1_uxg4Qf93gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgiiYSCGtUOQQ3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ughs2ea7-kE5XHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugj_gIAyUkWWl3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggO5i8Su4Fd-HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
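The raw response above is a JSON array of per-comment codes, one object per comment ID with four coding dimensions. A minimal sketch of how such a payload might be parsed and validated, assuming the dimensions only take the values visible in this dump (the full codebook may define more):

```python
import json

# Allowed values inferred from this dump alone -- the real codebook may be larger.
SCHEMA = {
    "responsibility": {"none", "user", "developer", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only rows whose codes are in-schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Hypothetical one-row payload in the same shape as the response above.
raw = '[{"id":"ytc_x","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(parse_codes(raw)[0]["policy"])  # regulate
```

Dropping out-of-schema rows (rather than raising) mirrors the common practice of re-prompting the model only for the comments that failed validation.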