Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect
- "Two dead motorcycles is nothing compared to the number of motorcylists on the ro…" (ytc_UgwDD6K8J…)
- "14:12 part of that money needs to go to using AI to solve humanities problems wh…" (ytr_UgxpzCnzA…)
- "People used to look for the right tools to make work more efficient and more pro…" (ytc_Ugyo0DmQ5…)
- "Glenn, you are a puppet for Trump and Israel. The way you frame his AI infrastru…" (ytc_Ugzj2jGS8…)
- "This is how A.I. is gonna kill us it dont have to be a big matrix terminator sty…" (ytc_UgzRNgxY3…)
- "an AI researcher yet still too dumb to pay attention to all the science fiction …" (ytc_UgyUvc15f…)
- "Give the Chinese credit for more upright but Tesla AI is probably way ahead! Wil…" (ytc_UgyAsmlAC…)
- "This guy just like, yeah i knew i was inventing something more dangerous than th…" (ytc_Ugz8WV1Hw…)
Comment
We should follow the rules already put in place. Don't cause harm to other people. If you're the person about to crash into an object and it's unavoidable /unless/ you purposefully run into someone else, that crash is just morally unavoidable. Hit the brakes as hard as you can and brace for impact.
Getting into a self-driving car will always be a risk. No one should sacrifice another just because they were put in an unlucky situation due to a risk they knowingly took. Keeping this "save me first" programming out of the hands of the rich/influential will mean laws that seem difficult to enforce.
youtube · AI Harm Incident · 2015-12-08T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugi1caexrD8SbXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghiyJc91JDD0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjSNaOTvVqec3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggO6Otzv4JUNHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UghVla3dyO4UzngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgiZ-f7irifHvXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghhlM_s-YcMu3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiQEwrx-Avbs3gCoAEC","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgiwHyTQg5h2BngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghMEOjSVojWFXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```