Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by its comment ID.
Comment
I understand the ethical dilemma this video is trying to point out, but this is a horrible example.
Self-driving cars should not be tailgating the cars in front of them. If the car in front suddenly stopped, or objects started falling off of it, then with enough space in between, the simplest and most logical solution would be to hit the brakes to minimize damage to everyone, including other vehicles.
Platform: youtube
Topic: AI Harm Incident
Posted: 2022-07-24T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
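Each dimension in the table above takes a categorical label. A minimal validation sketch, assuming only the label sets observed on this page (the real codebook may define more values; the function name is illustrative):

```python
# Label sets inferred from the coded responses shown on this page;
# the actual codebook may include additional values.
OBSERVED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "contractualist"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"approval", "resignation", "indifference", "mixed"},
}

def check(row: dict) -> list:
    """Return the dimensions whose value falls outside the observed label set."""
    return [dim for dim, allowed in OBSERVED.items()
            if row.get(dim) not in allowed]

# The coded record from the table above.
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "indifference"}
print(check(row))  # []
```

A row with an unexpected label (or a missing dimension) is flagged by name, which makes it easy to catch malformed model output before it enters analysis.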
Raw LLM Response
```json
[
  {"id":"ytc_Ugxi9edyRH6MBe-gmlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyHzAmobw_w11Mnb-d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxyiKB7SdSvnhxtM-N4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwDx5EZ27hVBDO-G3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy52cuaNxv0pCVCSnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzbIlfqTdOKTNc9ph14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxh5MKYmtMPkv_l1S94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlSt_4SUBX-NmdT0p4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugx2arFYsbTn-QoyeUF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
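Since the raw response is a JSON array keyed by comment ID, it can be parsed and indexed for lookup with a short sketch (field names taken from the response above; the variable names and the two sample rows here are illustrative):

```python
import json

# Raw model output: a JSON array of per-comment codes, in the shape shown above.
raw = '''[
  {"id": "ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzbIlfqTdOKTNc9ph14AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

# Index every coded row by its comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the codes assigned to a single comment.
record = codes["ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg"]
print(record["emotion"])  # indifference
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse of the batch response, then dictionary access per comment.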