Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by its comment ID or by clicking one of the random samples below.
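To make the ID lookup concrete, here is a minimal Python sketch. It assumes the raw batch responses are stored one JSON array per line in a file named raw_responses.jsonl; the filename, the storage layout, and the lookup_coding helper are illustrative assumptions, not this page's actual backend.

```python
import json

def lookup_coding(comment_id: str, path: str = "raw_responses.jsonl") -> dict | None:
    """Return the first coding record whose "id" matches comment_id, else None."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            if not line.strip():
                continue
            # Each line is assumed to hold one raw LLM response:
            # a JSON array of per-comment coding records.
            for record in json.loads(line):
                if record.get("id") == comment_id:
                    return record
    return None
```

For example, lookup_coding("ytc_Ugg36gd_wQOCXHgCoAEC") would return the record behind the Coding Result shown further down.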
Random samples (click any to inspect):

- ytc_Ugxa6phNW…: They have the audacity to call themselves artists when they probably never picke…
- ytr_UgwfzHm9m…: People are not horses. Horses were left to die. Art is not to bring you from poi…
- ytc_UgxB91QaK…: It's stuff like this that makes me think UBI will be an inevitability. Sure, we …
- ytc_UgwaLcI4F…: "proof that anti ai people are homicidal maniacs" of course he thinks an empty t…
- ytc_Ugz68qg9T…: seeing all the examples in this video of comments and emails especially at 10:01…
- ytc_Ugx83rxdb…: If we trap 🪤 the AI robots 🤖 with a nuke it would help but the nuke would be in …
- ytr_UgxGVeypu…: @prestonrobert2625 All enforcers(LEO) in our country are trained to hate the p…
- ytc_Ugz_1v9V-…: I have neverrrr been able to learn online. This is something that would work aga…
Comment
I've gone over this very topic several times in the last few years and have come to the same conclusion each time. The only ethical way of handling accidents with self-driving cars is for each of them to prioritize the lives of people *outside* of the car. That means they will always choose to put passengers in danger before putting outsiders in danger. This way, it's the passenger's conscious decision to trust their life to a car the moment they enter it.
Platform: youtube · Scenario: AI Harm Incident · Posted: 2015-12-08T19:2… · ♥ 323
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
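The five dimensions above make up one coding record per comment. As a minimal sketch of that record's shape, here is a Python dataclass; the Coding name is hypothetical, and the value lists in the comments are only those visible on this page, so the full codebook may define more per dimension.

```python
from dataclasses import dataclass

@dataclass
class Coding:
    # Shape of one coding record as it appears in the raw LLM response.
    id: str              # platform-prefixed comment ID ("ytc_..." / "ytr_...")
    responsibility: str  # seen here: "ai_itself" | "developer" | "distributed" | "none"
    reasoning: str       # seen here: "deontological" | "consequentialist" | "contractualist"
    policy: str          # seen here: "none" | "regulate" | "liability"
    emotion: str         # seen here: "approval" | "indifference" | "fear" | "resignation"
```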
Raw LLM Response
```json
[
  {"id":"ytc_UghSiRcVXA-3FHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg36gd_wQOCXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggzSEiGsQNLKngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghidMHZsCybB3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjzNTXzuzIxOngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UghfmsovrnUJPXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjQy7gtc5pA_XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugio_pXgICTxCXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggKQCpjXBYZKXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgggitcG_CbrUXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
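Because the raw response is a plain JSON array, one plausible way to consume it is to parse the string, check that every record carries the five expected keys, and index the records by comment ID. The parse_batch helper below is a sketch of that step, not the pipeline's actual code.

```python
import json

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response into a {comment_id: record} index."""
    index = {}
    for record in json.loads(raw):
        # Guard against malformed model output before trusting the record.
        missing = EXPECTED_KEYS - record.keys()
        if missing:
            raise ValueError(f"record {record.get('id')!r} is missing {sorted(missing)}")
        index[record["id"]] = record
    return index
```

Applied to the response above, parse_batch(raw)["ytc_Ugg36gd_wQOCXHgCoAEC"] yields the same values as the Coding Result table: ai_itself, deontological, none, approval.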