Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I recently took on a graphic design course. I'm a 30 year old mother of 3, I hav…" (ytc_UgxMt85E5…)
- "cool, so they are committed to AI bot decisions? It's a question of life and…" (rdc_kskumfv)
- "The real story here isn't just that researchers are leaving - it's WHERE they're…" (ytc_UgwqAmQrB…)
- "Theoretical foundation of modern computer algebra arise roughly in 1970s to 1990…" (rdc_nor8k2o)
- "Come on over to the federal government because it'll take them 50 years to do co…" (ytc_UgzewfBAy…)
- "Well, buddy! You might be in debt 25.000 USD and will never be hired cause every…" (ytc_UgzmjISoR…)
- "On top of this, it seems weird to make an article focused only on men with this …" (rdc_ohz900h)
- "To me he seems to be making a lot of assumptions about how neurons actually work…" (ytc_UgwuwLfvS…)
Comment
if u let me think a bit outside the scenario, I would say that all cars should be self driving, but with a manual mode option. even while in manual mode, the car should be able to take over in extreme situations. this way, surrounding vehicles would notice the falling debris and all would react together to save the car behind the debris. if staying strictly within the scenario, where, presumably, the cars surrounding the subject are not self driving, there are some tough decisions to make. I would say the car should first of all work for the driver's safety, as all cars are made to do. regarding the surrounding vehicles, i do not believe anyone (programmers) has the right to chose who dies, so randomization? that's the best I can come up with for now
youtube · AI Harm Incident · 2017-06-22T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
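Every coding result uses the same four-dimension record. A minimal sketch of that record type is given below (Python); the value sets noted in the comments are assumptions inferred only from the codes visible on this page and may not be exhaustive.

```python
# Minimal sketch of one coded-comment record, assuming the schema shown above.
# The example values listed in the comments are only those visible on this
# page; the full code set may be larger.
from typing import TypedDict

class CommentCoding(TypedDict):
    id: str              # comment ID, e.g. "ytc_..." (YouTube) or "rdc_..." (Reddit)
    responsibility: str  # observed values: "none", "developer", "company", "user"
    reasoning: str       # observed values: "consequentialist", "deontological", "unclear"
    policy: str          # observed values: "none", "regulate", "ban", "liability"
    emotion: str         # observed values: "approval", "outrage", "indifference"
```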
Raw LLM Response
```json
[
{"id":"ytc_UggttszQdOIT0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugjeinq77JWQDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi9KaGK7Pz36HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjT75c_hYfYXngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghemGmrqvMpTHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugg9VdcEV0AJu3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiMecIFIV9nLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugio-4_UVi4xP3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugh2ii7t431fIXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UggBZZ06kKai4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
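The raw response is a JSON array with one record per comment in the coded batch. Below is a hypothetical sketch of how such a response could be parsed and indexed to support the look-up-by-ID view; the function name and usage lines are illustrative, not part of this tool.

```python
import json

def index_by_comment_id(raw_response: str) -> dict[str, dict]:
    """Parse one raw LLM response (a JSON array of per-comment codes)
    and return a mapping from comment ID to its coded dimensions."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Example: look up the codes the model assigned to a single comment.
# raw = ...  # the JSON array shown above, as a string
# codes = index_by_comment_id(raw)
# codes["ytc_UggttszQdOIT0XgCoAEC"]["emotion"]  # -> "outrage"
```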