Raw LLM Responses
Inspect the exact model output behind any coded comment. Look a comment up by its ID, or pick one of the random samples below.
Random samples (click to inspect):
- I had to get off the code learning subs because I just couldn't have another arg… (rdc_jabfacd)
- I would rather spend my time learning how to draw rather than using an ai art ge… (ytc_Ugz8THDhJ…)
- “Will make the world a better place.” -Often said by psychopaths with a fringe m… (ytc_UgyJz3MUO…)
- I seriously don’t get the argument some ai promoters say when it “democracies” a… (ytc_UgzpFO6EV…)
- Honestly, when I started drawing, it was only because I had so many ideas that I… (ytc_UgwaZ9_rw…)
- Even though I disagree to the starting argument, I appreciate hearing the other … (ytc_Ugyc0n4OU…)
- So interesting that you quoted, China, Russia and Iran could use AI for wrongdo… (ytc_UgzjKYHZY…)
- It’s amazing we just sit here and wait for AI to take almost all jobs from peopl… (ytc_UgzRBKYya…)
Comment
> Random Ashe Consider this, what if you were driving your ‘autonomous’ car and it was your family who is in danger. That is the dilemma that we are presented with. As you stated, you want a car that will prioritise your safety. So, if such concept of a situation were to happen, what decision would your autonomous car make that would guarantee your personal safety (as all cars would), but one that you’d compromise yours for the sake of your family.
> The autonomous car wants everyone uninjured, but you’d protect your family, and your autonomous car would (as you stated) protect you.

Source: youtube · AI Harm Incident · 2018-08-31T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
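
Each coded record can be checked against the set of legal values per dimension. Below is a minimal validation sketch in Python; the allowed sets are inferred only from the values visible on this page (the full codebook may define additional categories), and `validate` is a hypothetical helper, not part of the tool.

```python
# Allowed values per coding dimension, as observed on this page only;
# the actual codebook may contain more categories.
ALLOWED = {
    "responsibility": {"unclear", "developer", "distributed", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "liability", "regulate", "industry_self", "ban", "none"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with a coded record; empty means it passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems
```

For example, `validate({"responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"})` returns an empty list.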
Raw LLM Response
```json
[
  {"id":"ytr_UgyF6F5mYMV-Pa5y5GZ4AaABAg.8hdOXOKKyqR8kbvmeVyUHC","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwkhoJUXCOdqrrdO_F4AaABAg.8ZuDxkqG2qH98bK0NRx2AW","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytr_UgwE0M-nX5hQSlgJMIt4AaABAg.8ZLDyabu4ux9mLu7ZsOo7e","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UghcPoA1NFGlengCoAEC.8NyX3Egi-Fc9mLtvosPf4W","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytr_UghcPoA1NFGlengCoAEC.8NyX3Egi-FcA1PIA3250ul","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8NwFFi8Pqfy","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8OAIyM-R3lw","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8OASggjEx2E","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgiJaxBMly9MvXgCoAEC.8Ndi_y4XrqB8NolwAfwMh6","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgiJaxBMly9MvXgCoAEC.8Ndi_y4XrqB8NpIBpJMJ3W","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"}
]
```
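
Because each raw response is a JSON array of records keyed by an `"id"` field, the ID lookup described at the top of this page reduces to parsing and filtering. A minimal sketch, assuming the raw response is available as a string; `lookup_by_id` is a hypothetical helper, not part of the tool.

```python
import json

def lookup_by_id(raw_response: str, comment_id: str) -> dict | None:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return the record whose "id" matches, or None if absent."""
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        return None  # the model output was not valid JSON
    for record in records:
        # Skip any malformed entries that are not objects.
        if isinstance(record, dict) and record.get("id") == comment_id:
            return record
    return None
```

Called on the response above with `"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8NwFFi8Pqfy"`, it returns the user/deontological/ban/outrage record.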