Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "There may be a missing component in the autopilot systems that still need to be …" (ytc_UgykCCSFq…)
- "22:12 I know I'm a bit late to the party (commenting 6 months afterwards) but wh…" (ytc_UgzXxLL6O…)
- "\"Gateway\" [Frederick Pohl - 1977], he was way ahead of the curve with an AI for …" (ytc_Ugy1okNUD…)
- "Tesla will win they can scale up by the Millions Where Waymo can't. AI Is Prog…" (ytc_UgxdsPpve…)
- "I'm a customer service rep in the Telecom industry except I communicate to you b…" (ytc_UgwTQJTpF…)
- "Why do you need robot friends when you have human friends ... They are idiots wh…" (ytc_Ugwl56F0f…)
- "People eating up Ai without any thought of consequence. Can anyone think of anot…" (ytc_UgwIgy3wx…)
- "A.I. is like giving a live hand grenade to a child then running off. Humanity bei…" (ytc_UgwVrlLu6…)
Comment

> Your question reminds me a bit of the recent ethical dilemma being faced by manufacturer's of self driving vehicles. In certain unavoidable accident scenarios, the vehicle may to have make the decision to potentially hurt people in order to avoid hurting a greater number of people. It may even have to sacrifice its passenger.

youtube · AI Moral Status · 2017-03-12T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytr_UgjLYJhHPMsUEHgCoAEC.8PtuUTIQEvX8PwSVSJcIhI","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgiVwjkV9NIQk3gCoAEC.8PrjymS4JXo8PsaiuqXSnb","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgiVwjkV9NIQk3gCoAEC.8PrjymS4JXo8PtiwpvP6H8","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugi318Nm44FAC3gCoAEC.8PqhuLdkTCr8QUrkX658J7","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UghZs-vx_DY4WngCoAEC.8Ppcp7Bn9RL8Pq9acEMmtW","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UghZs-vx_DY4WngCoAEC.8Ppcp7Bn9RL8PqvLGRI204","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugh4-V51At2SPngCoAEC.8PouRlAd_Sn8Q_7dKAoD2q","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgggUPfJ9n2pw3gCoAEC.8PnRiYml4Pl8Q03__JyheV","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_Uggczad5RakHtngCoAEC.8P_l9quOfj68PacjLfPG-g","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgirsstFTRgqcHgCoAEC.8PWp2MxMP0z8PWqqddcvWJ","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
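A response in this shape is straightforward to parse and sanity-check before storing the codes. The sketch below is a minimal example, not the tool's actual implementation: the allowed values per dimension are only those observed in the sample batch above, so the real codebook may define more.

```python
import json

# Values observed in the sample batch shown above; the full codebook
# may allow additional values (this set is an assumption, not the schema).
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag out-of-vocabulary values."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row is missing a comment id: {row!r}")
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"indifference"}]')
coded = parse_llm_response(raw)
print(coded[0]["policy"])  # -> regulate
```

Failing loudly on an unknown value is useful here because LLM coders occasionally emit labels outside the requested vocabulary, and silent acceptance would corrupt downstream tallies.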