Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
It's not AI that's taking these jobs yet, AT&T is giving these jobs to India and…
ytc_Ugw8gKr7t…
Right on with the K-12 education component. That's the first line of defense to …
ytc_UgxAURW7T…
They don't think the art is bad. They think that how it's made is based. It stea…
ytr_UgyVW8iQ3…
I can see why you might think that! Sophia definitely has a unique way of expres…
ytr_UgwtpA28A…
Company took art. Trained machine to make more art. Artists weren't paid to trai…
ytc_Ugy87ndYp…
I don't think deepfakes are the issue here. Sounds like Korea has an epidemic of…
ytc_Ugwwr-bmi…
Thank you for this interview. I have been reading and going deeper into the AI t…
ytc_UgwlNLTSu…
@mayanksharma4651 nah even today ai is more of a tool to enhance programming. It…
ytr_UgxI97afG…
Comment
I see two possible reasons why a self-driving car would have a pre-programmed decision for such a scenario:
1) A programmer anticipated the car being in just that situation and programmed in a rule to take care of the outcome rather than coming up with a rule to avoid getting into that situation in the first place - like "don't tailgate when you don't have room to swerve".
2) No programmer anticipated that exact scenario, but there are sufficiently broad rules for scenarios that are close enough that they can be applied in order to get a decision - which may or may not be a good one.
In the first case, that programmer is responsible for the outcome of the situation (though anyone who imposed additional constraints on his work that prevented a solution that avoided the scenario in the first place bears their own share of responsibility). In the second case, barring negligence, the programmer is not responsible for the outcome.
In scenarios where a self-driving car is actually boxed in, a large part of the problem is the non-self-driving vehicles doing the boxing - it's very easy to design self-driving cars with "flocking" behaviour that would allow them to avoid a collision provided there's a way to avoid the collision if enough vehicles coordinated their movements - the reaction time on these things is short enough, and the individual decisions simple enough, that each vehicle can act autonomously and the net effect be as though they were co-ordinated by a single processor rather than simply communicating by their motion. So swerve toward the SUV, which will swerve or accelerate away from you (meanwhile, the vehicles behind brake sharply)...
youtube
AI Harm Incident
2016-01-16T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Uggqx26B0vYlNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjwZCpf6uJ5EngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgjQFdEz8fzO-ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UggF86o_OEFCZHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugh-bk-TAV7aFXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgileDub0CwddngCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjpqrVAg7rgYngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghQCXhv7515e3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgisOSWSkQ0bTXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
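A raw batch response like the one above can be parsed and sanity-checked before its rows are stored as coding results. The sketch below is a minimal, hypothetical validator: the four dimensions and their allowed values are inferred only from the table and JSON shown on this page, so the real codebook may permit additional values.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# This is an assumption, not a published schema.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "approval", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose coded
    values all fall inside the inferred codebook."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if "id" in row
        and all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Usage with a single hypothetical row (the id is a placeholder):
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"mixed"}]')
print(len(validate_batch(raw)))
```

Filtering rather than raising keeps a partially malformed batch usable; rejected rows can be logged and re-queued for recoding.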