Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
well we only got to give them enough intelligence to do a specific job we dont n…
ytc_Ugz8pO1E1…
Just going to add that they are thinking of Ai governments . Look it up…
ytr_Ugwk8QPLX…
DON'T TRUST WHAT ANY MAN MADE ROBOT SAYS! THIS ROBOT especially is Lying out i…
ytc_UgzPjVhye…
I think there needs to be some serious regulation around AI and human art. Like…
ytc_UgxZLzyT1…
In the universe of stupid ideas, EV's are super Novas. Add self driving and its …
ytc_UgzfaqVQa…
Okay, but driverless cars/trucks are gonna be way safer and efficient. We need t…
ytc_UgwiA7TtU…
Algorithms for facial recognition are written by teams that are overwhelmingly E…
ytr_UgyI45Fzw…
There's nothing you or anyone can do to stop AI progression. Why? Because it's m…
ytc_UgyRnlrU2…
Comment
The answer is that the vehicle should reduce as much harm as possible. So if you have 3 options, hit the person on the left/Right or keep driving forward. The answer is to keep driving forward or Brake. Whether or not breaking will save the passengers it WILL reduce damage caused. This way no-one other than the passengers of the self driving vehicle are at risk.
youtube
AI Harm Incident
2015-12-11T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugjwkh7gbtadm3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UghlKJ8Nc_INgHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UghjkWiCvWeo1ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghilDXtRwfSCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjN2KgJTlwlC3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh54ZJdEXZfwngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjWA5kpI1F_UHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjB5N2AWV6PlHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugh2zj0x13RnS3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiaFVeDpzC9U3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
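A raw response like the one above can be parsed and indexed by comment ID before the per-comment view renders it. The sketch below is a minimal, assumed implementation: the allowed category values are only those observed in this batch (the real codebook likely defines more), and the function names are hypothetical.

```python
import json

# Category values observed in the batch above. This is an assumption:
# the actual codebook for this tool may allow additional values.
OBSERVED_VALUES = {
    "responsibility": {"none", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "mixed", "indifference", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coding records)
    and return a dict keyed by comment ID, rejecting records with
    missing dimensions or unrecognized category values."""
    coded = {}
    for record in json.loads(raw):
        comment_id = record["id"]
        for dimension, allowed in OBSERVED_VALUES.items():
            value = record.get(dimension)
            if value not in allowed:
                raise ValueError(
                    f"{comment_id}: unexpected {dimension}={value!r}"
                )
        coded[comment_id] = record
    return coded

# Example lookup on a two-record batch:
raw = (
    '[{"id":"ytc_Ugjwkh7gbtadm3gCoAEC","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"approval"},'
    '{"id":"ytc_UghlKJ8Nc_INgHgCoAEC","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"mixed"}]'
)
coded = index_codings(raw)
print(coded["ytc_UghlKJ8Nc_INgHgCoAEC"]["emotion"])  # mixed
```

Indexing by ID is what makes the "Look up by comment ID" feature above a constant-time dictionary access rather than a scan over every stored response.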