Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples

- Most AIs can't recreate exact faces, its an explicit guard rail. You can however… (ytc_UgzdR0oLv…)
- The big beautiful bill (that is now law) specifically prevents all 50 states fro… (ytc_UgyjRi7Ta…)
- AI only looks at things logically, not emotionally or spiritually. If what you’r… (ytr_UgyWUDYBt…)
- "The only connection you have to an AI, is the connection to your own stupidity.… (ytc_UgzlVMHSG…)
- This conversation rubs me the wrong way. It is fear based on lack of understandi… (ytc_UgxRF5zFk…)
- I know exactly what will happen, it's all written out in the bible, Revelation c… (ytc_UgwrXQYLy…)
- It's not an AI problem, it's a problem of unchecked capitalism. You want it to s… (ytc_UgyD7bbhC…)
- When you said all these people or robot one of the robots give you the side eye… (ytc_Ugx870meI…)
Comment

> A robot does not care for justice. A robot cares for what its programming says. If the programming says hurt the least people, it wil do that. If the programming says save the driver at all costs, it will do that, though likely at a coin-flip. Don't let the self-driving car make that decision. This is why we need manual override, so the user can make these decisions a robot can never hope to answer.

youtube · AI Harm Incident · 2017-07-09T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
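Each coded comment can be thought of as one small structured record. Below is a minimal sketch, assuming a Python pipeline; the class name `CodedComment` is hypothetical, the field names mirror the JSON keys in the raw LLM response shown below, and `coded_at` is the timestamp added when the coding is stored (the "Coded at" row above).

```python
from dataclasses import dataclass

# Hypothetical record type for one coded comment; field names mirror the keys
# in the raw LLM response, with coded_at added at storage time.
@dataclass
class CodedComment:
    id: str              # platform-prefixed comment ID, e.g. "ytc_..."
    responsibility: str  # e.g. "developer", "user", "company", "ai_itself", "none"
    reasoning: str       # e.g. "deontological", "consequentialist", "contractualist", "unclear"
    policy: str          # e.g. "liability", "regulate", "none"
    emotion: str         # e.g. "fear", "outrage", "indifference", "resignation", "approval", "mixed"
    coded_at: str = ""   # ISO-8601 timestamp, e.g. "2026-04-27T06:24:59.937377"
```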
Raw LLM Response
```json
[
{"id":"ytc_Ugjnw_pI28jYpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghwOGDepVXCWngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj3YY9osWlB4HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UghifWP6y7_ogXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg4ldklSPeo8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjokRxbpwiSqHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBFWCU-Fp7bngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjOOT8Vua498ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
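Because each raw LLM response is a JSON array with one entry per comment in the batch, looking up a coded comment by ID amounts to parsing the array and indexing it. Below is a minimal sketch, assuming Python; `index_codings` and `raw_llm_response` are illustrative names, and the two embedded entries are copied from the response above (the first appears to correspond to the coding result shown for the displayed comment).

```python
import json

def index_codings(raw_response: str) -> dict[str, dict]:
    """Parse one raw LLM response (a JSON array of per-comment codings)
    and index the entries by comment ID for direct lookup."""
    return {entry["id"]: entry for entry in json.loads(raw_response)}

# Two entries copied from the raw response above, standing in for the full array.
raw_llm_response = """[
  {"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]"""

codings = index_codings(raw_llm_response)
coding = codings["ytc_UgjywnlXpJLqFHgCoAEC"]
print(coding["responsibility"], coding["reasoning"], coding["policy"], coding["emotion"])
# prints: developer deontological liability fear
```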