Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "hey wanted to pop in but as a disabled artist who gets rlly bad hand cramps and …" (`ytc_UgxNURn_2…`)
- "@zackarydev no. Mr stranger is correct. Think of it this way. Think of a t-shirt…" (`ytr_UgwOct2uG…`)
- "AI is promising, but waaaaay overhyped. I can see it being a transformative tech…" (`ytc_UgwR1SzXo…`)
- "💥💣💥 TRUTH BE TOLD, PLAIN AND SIMPLE! SOON IT'LL BE JUST ELON AND HIS STAFF OF …" (`ytc_UgzNUkZoj…`)
- "the next thing to explore is this !!! The End of the World? John Lennox on AI…" (`ytc_UgxF1Heg-…`)
- "Yep lets get rid of automation - Throw out your washing machine and spend the wh…" (`ytc_UgwwLkrHe…`)
- "fatso here is not making sense and he knows it. an AI is like a well trained par…" (`ytc_UgwcKA8aY…`)
- "I think the issue is that you’re taking a phase ChatGPT used (and that we all us…" (`ytc_UgzUtAkp0…`)
Comment
> This is a fucking stupid argument against self driving cars I am tired of seeing. Situations where it will literally be unavoidable to kill someone will be extremely rare given the improved reaction times/control self driving cars will have.
>
> You know what would probably happen? It would do whatever it could to cause the best chance of reduced injury.
>
> But these sort of hypothetical questions pointless: "What if aliens came down and said they would kill all humans if you didn't murder your child, would you do it!?!?!?!?!?"
>
> A real person in that situation wouldn't even have time to react before hitting and killing both people. But no, lets keep human drivers because the trolley problem is hard.
Source: reddit · Category: AI Harm Incident · Timestamp: 1504815035.0 · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | utilitarian |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_dmp2zd0", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "unclear"},
  {"id": "rdc_dmp6sw7", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_dmp9o7f", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_looxc0k", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "rdc_loq5txa", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
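The lookup-by-comment-ID workflow described above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation: it assumes only that the raw LLM response is a JSON array of records, each carrying an `id` plus the four coding dimensions, as in the example shown. The function name `index_by_id` and the truncated sample payload are ours.

```python
import json

# A raw LLM response in the format shown above: a JSON array in which each
# element codes one comment on four dimensions, keyed by its comment ID.
# (Two example records, copied from the response above.)
raw_response = """
[
  {"id": "rdc_dmp2zd0", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "unclear"},
  {"id": "rdc_dmp6sw7", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

# Look up the coding for a single comment by its ID.
codings = index_by_id(raw_response)
print(codings["rdc_dmp6sw7"]["emotion"])  # outrage
```

Keying the parsed array by `id` turns the linear response into a constant-time lookup, which is what "look up by comment ID" requires once a batch contains many coded comments.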