Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
To say the car has to make an ethical choice and must hit another vehicle is a false premise based argument. The situation is far more dynamic that this. Decisions can be made thousands of times a second. And to say the car couldn't stop in time is saying the car is following too closely already. Something a self-driving car is *not* going to do. This is a made up scenario whose likelihood of occurring is so remote as to remove itself from consideration.
Platform: youtube
Topic: AI Harm Incident
Posted: 2015-12-21T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgicJ8o6vgL9vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjgjA3QBACveXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgiIRvaFLRy4BXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugi6wxkU3JS5u3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjSjaD1amn_NHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughy05zsMvO4YHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjFM6BROUj5UHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UghkEkbZMbCpeXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UggNzTObvFdx33gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugg5W6YbwRYNMHgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```
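Since the model returns a JSON array of coded rows, each downstream record depends on that array parsing cleanly and staying inside the codebook. The sketch below shows one way to validate such a batch before ingesting it. The allowed category sets are inferred from the values visible in the response above and are an assumption, not the authoritative codebook; the function name `validate_batch` is hypothetical.

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample response above -- the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "government", "distributed"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"none", "unclear", "regulate", "industry_self", "liability"},
    "emotion": {"approval", "outrage", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-codebook rows."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: invalid {dim} value {value!r}")
    return rows

# Example: a single well-formed row passes validation.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
print(len(validate_batch(raw)))  # → 1
```

Rejecting a whole batch on the first bad row is the simplest policy; a production pipeline might instead collect per-row errors and re-prompt the model for only the failing comment IDs.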