Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
WTF. Insofar as this benefits humanity, the Earth and all its ecosystems without…
ytc_UgywoAITk…
When the ai's see this they are going to launch the missiles. We're doing Humani…
ytc_Ugx5_jSVY…
I think it makes more sense that this is just hype. The thing is, AI doesn't do …
ytc_UgxHUDyV5…
Yep that made me almost fall of my chair ha ha, what an idiot, of course we do u…
ytr_UgzuU0OpW…
The reason I'm not going to take your word for it is because you are wrong. Gene…
ytc_UgzLQK2BU…
Exactly ! Although, I would rewrite your second sentence as "This is the reason …
ytr_UgzLYsffO…
@xxxprogamerxxx5909 while I get your point, I still think the ethics haven’t cau…
ytr_Ugym2S2av…
I gave a polite and well thought out argument against ai where i talked about ho…
ytc_UgyEq8jP7…
Comment
My question of the auto pilot programmers is this . At some point the auto pilot will have to make a decision to kill one human to save 3 humans .example a Tesla is happily driving down the road on auto pilot when a car comes down the wrong side of the road the only way to avoid a head on collision is to run over a pedestrian looking at her phone . Does the AI run over the pedestrian to avoid a head on collision that could kill the occupants in the Tesla.
Thank you I'll wait for my reply
youtube
AI Harm Incident
2022-05-21T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx-J71CMB9VJtmNS3F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzS5PnDTuE4Qt_ZtxF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzB3mrdBffosdTBnq14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzx32_wxeoG6zs8uyR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgweRYOzCRtOgT2b4FZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxmSyXHj9outwzC9Gh4AaABAg","responsibility":"manufacturer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxuKC6Bjt-HvbFoof54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz-ZoTbk0SeMEzYwuJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzHv5h2iHbjfo8wVZ54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxC0kM-rB5_RCSh2094AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
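A batch response like the one above is only usable if every row parses and every code falls inside the codebook. The sketch below shows one way to validate such output before it reaches the coding table. The allowed-value sets are inferred from the sample rows plus the "unclear"/"mixed" fallbacks visible above; the actual codebook may include values not seen here (assumption), and the `ytc_`/`ytr_` ID prefixes are likewise taken from the sample list.

```python
import json

# Allowed values per coding dimension. Inferred from the sample output
# above -- the real codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "manufacturer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed", "unclear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # "ytc_" = top-level comment, "ytr_" = reply (prefixes as seen above).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and drawn from the codebook.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"unclear"}]')
print(len(validate_response(raw)))  # 1
```

Rows that fail validation are dropped rather than repaired here; in practice one might instead queue them for a retry prompt or manual review, so that a malformed batch never silently contaminates the coded dataset.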