Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Just another way to control the population. Everything comes on trucks. Control…" (ytc_UgxViKYS2…)
- "NOBODY wants ai. Its the billionaires who funded their friends needing to get a …" (rdc_nug176x)
- "Elon is so right and honourable about Open AI...I totally respect his wisdom ! T…" (ytc_Ugx2sU2Lo…)
- "Heck I can't tell if the hosts of these shows are AI... they look fake to me…" (ytc_UgwLjALGv…)
- "This actually doesn't impress me. It's minesweeper. Gotta be a million and a hal…" (ytc_UgxgHgQGh…)
- "If women start doing deepfake gay videos of them, will it help them understand t…" (ytc_UgzJ0nYq_…)
- "'A child in Israel is the same as a child in Gaza', sure, physically. Bear in mi…" (ytc_Ugw4oIIvx…)
- "i agree i think. i miss when AI art was self-aware that it was made as a joke be…" (ytr_Ugwc4viyA…)
Comment
I get the complication involved here, but what needs to be understood is that in the situation described here, an accident is bound to happen. We’re focusing on *who* we want the accident to occur with. The way I see it, that’s inconsequential. A self driving car is designed and programmed by humans, so some our humanity is transferred there. Including our disability to make rational decisions when under extreme and sudden pressure. I say we don’t try and micro-analyse every decision a car made during an accident, because that’s the same as micro-analysing every decision a human made during an accident. Which is hardly a fair thing to do to the human. The car was made by a human, it will have human limitations.
Platform: youtube
Topic: AI Harm Incident
Posted: 2022-08-01T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugxi9edyRH6MBe-gmlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyHzAmobw_w11Mnb-d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxyiKB7SdSvnhxtM-N4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDx5EZ27hVBDO-G3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy52cuaNxv0pCVCSnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbIlfqTdOKTNc9ph14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxh5MKYmtMPkv_l1S94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlSt_4SUBX-NmdT0p4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx2arFYsbTn-QoyeUF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
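The raw response above is a JSON array of per-comment codes along the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch might be parsed and validated follows; the allowed value sets are inferred only from the values visible in this sample output, not from an authoritative codebook, and the function name is illustrative:

```python
import json

# Value sets observed in the sample batch above; a real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "contractualist"},
    "policy": {"unclear", "none", "liability"},
    "emotion": {"approval", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response into {comment_id: codes}, checking values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded

# Hypothetical single-record batch in the same shape as the raw response above.
raw = ('[{"id":"ytc_X","responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"resignation"}]')
print(parse_batch(raw)["ytc_X"]["responsibility"])  # developer
```

Failing loudly on an out-of-vocabulary value is useful here because LLM coders occasionally emit labels outside the schema, and silent acceptance would corrupt downstream tallies.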