Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I am 1000% on your side and agree with just about every point you make. But it w… (ytc_UgwQMux8h…)
- SVG: “I feel this, I feel that, I don’t want to do dishes anymore; in twenty yea… (ytc_UgwpEO0Zm…)
- I fear that wishing humans were more like an AI companion is the beginning of th… (ytr_UgyKpAojv…)
- Your self driving car was a Jaguar? Pretty fancy! The emblem on the steering w… (ytc_UgziLbbrX…)
- I personally think this is being overblown and overlooked. IF AI does "everythin… (ytc_UgyDN4EuH…)
- Man, I am starting to hate people that think that AGI is coming and it's gonna b… (ytc_UgwoSATOD…)
- Don't worry ai is a double edge sword. It won't take long until companies would … (ytc_UgwX74qes…)
- Read in an article it did stall on route to get passengers. But, stalled due to … (ytr_Ugx4Nym4Y…)
Comment
This video made me question my utilitarian philosophy big-time - going well beyond the efficacy of self-driving cars. I realize now I was just generalizing by saying I believe in utilitarianism - saying stuff like "to minimize harm" or "to maximize happiness". But, as is usually the case when it comes to one's personal philosophies/ beliefs (especially if said "one" is twenty two), it turns out life is tricky and many a scenario could be thought up that leads to conflicting answers using the same basic rules.
youtube · AI Harm Incident · 2016-10-17T04:3… · ♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgjGy_ree2B0EHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj96NpyN-f2BXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghqMvbGky59jHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi_k_2d8FQ3c3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugg_qQYiL1e7ZngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggUGDnRAEQYy3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggfRtqOpBkxgHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggNnXWdPpcRW3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjqog_GKULDRHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghYlkS6IWtLL3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]
```
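The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response can be parsed and indexed for lookup by comment ID (the `lookup` helper and the "unclear" fallback are illustrative assumptions, not this tool's actual implementation):

```python
import json

# Hypothetical sketch: parse a raw LLM coding response (a JSON array of
# per-comment codes, as shown above) and index it by comment ID.
raw = '''[
 {"id": "ytc_UgjGy_ree2B0EHgCoAEC", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_Ugi_k_2d8FQ3c3gCoAEC", "responsibility": "company",
  "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]'''

codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id):
    # Assumed behavior: a comment ID missing from the response falls back
    # to "unclear" on every dimension.
    default = {dim: "unclear"
               for dim in ("responsibility", "reasoning", "policy", "emotion")}
    return codes.get(comment_id, default)

print(lookup("ytc_Ugi_k_2d8FQ3c3gCoAEC")["emotion"])  # outrage
print(lookup("ytc_missing")["responsibility"])        # unclear
```

Indexing by ID also makes malformed or incomplete model output easy to handle: any comment the model skipped simply resolves to the "unclear" defaults, matching the all-"unclear" coding result shown above.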