Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgxUv8nqY…`: "Even for people who are just using AI for fun, it still has an extremely bad aff…"
- `ytc_UgzV9Vu_x…`: "What AI weapons have to do with climate change? ??? Climate change is a big lie…"
- `ytc_Ugz4Peqmt…`: "nah, Indonesia doesnt that cooked, i work as iot infra engineer, even almost all…"
- `ytc_UgwNTA6-V…`: "He purposely lied about this he was told many times from people in silicon valle…"
- `ytc_UgwcMKwjp…`: "Honestly, this whole scenario is pretty unrealistic. Think about it — no matter …"
- `ytc_Ugz5byaaG…`: "I see absolutely no problem in using the work of an artist to generate more piec…"
- `ytc_Ugy9ld5x8…`: "Logitech didn't \"build AI into their drivers\" it's literally just a hotkey to la…"
- `ytc_UgxQiItCD…`: "If AI's actually more effective than human CEOs, it's going to hire people for s…"
Comment

> If every car on the road would be self-driving and intelligent, I think these scenarios would be quite rare. However, it's difficult to produce an answer if faced with these ethical dilemmas. Rather, we should spend more of our efforts in preventing them.
>
> Still, Murphy's law is a bitch.

youtube · AI Harm Incident · 2015-12-08T22:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UghzuMMpBbsZkngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughhc-RnxMS1LXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjY6HJikXmw-HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghIQezVaUOb-3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgicExh_IjSyAXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjRcySEHSlNsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugha7FPAvBu3AngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggqSiIbUqJPI3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghJ2QECp_kzO3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghQd27Kawk0s3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
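The raw response above is a JSON array with one object per coded comment, keyed by comment ID, with the same four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A lookup like the page's "look up by comment ID" feature could be sketched as follows; the field names come from the sample response, while the function name, constants, and abbreviated two-record sample are hypothetical illustrations, not the actual pipeline code.

```python
import json

# A raw batch response in the same shape as the sample above
# (abbreviated to two records for illustration).
RAW_RESPONSE = """[
  {"id": "ytc_UghzuMMpBbsZkngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghQd27Kawk0s3gCoAEC", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_batch(raw):
    """Index a raw batch response by comment ID, keeping only known dimensions."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Default any missing dimension to "none", matching the codebook's
        # apparent fallback value in the samples above.
        coded[rec["id"]] = {dim: rec.get(dim, "none") for dim in DIMENSIONS}
    return coded

coded = parse_coding_batch(RAW_RESPONSE)
print(coded["ytc_UghQd27Kawk0s3gCoAEC"]["responsibility"])  # → user
```

Indexing by ID up front makes each inspection an O(1) dictionary lookup rather than a scan of the raw response, which matters once a run contains thousands of coded comments.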