Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- AI can't have consciousness... its only these materialists who believe AI could … (ytc_Ugx_bgDX4…)
- I think in the next 5 years we'll see that this was not the revolution that ai c… (ytc_UgxfvgjVr…)
- Actually the scary part is the AI saying we are not there athe point to be conce… (ytc_UgzC50bUB…)
- "Transportation!" - reminded me of the Ismo joke, when we are about to be hit by… (ytc_UgzM0V4fG…)
- Ask yourself a question. What does a human organism have that AI will never have… (ytc_Ugx2vWpFL…)
- It seems the interviewer doesn’t consider AI having a higher intelligence than h… (ytc_UgyP7Xz11…)
- For his safety I think, there is an interview (forget which one) that shows one … (ytr_Ugxsnrs2I…)
- Bernie, professor Wolff was just on the Status Coup and explained the threat in … (ytc_UgwqTfOhd…)
Comment
One thing that no one mention....does not matter how good the set up is (software, car, brakes and weather) - if anyone jumping, running or stepping out point blank into a path of a moving car (BE it a human or autonomous vehicle) you going to be hit and maybe killed - with an autonomous vehicle the thinking distance could be smaller vs a human but if you do not give a car enough room to slow down/stop people are going to get themselves hurt or killed and you never going to stop that unless you make all cars drive less than 20mph or 30kph
Source: youtube · Posted: 2018-03-21T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugy0iRDurE5kbF4u47t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxUw_ET2UI1DH2_rDl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzirg4tn7e-n962m7p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxl7LgqTG_8lxdhl_t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxV8ZzJyidy7LHYPH94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxQ7IMYr_Toyf73iRN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwFQTYdACGwhyoXqDd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzexROj7pjdcfp_2Hd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw3XOT-yQgUw7lYYRd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwHFJjq3Ja3_hjQjfJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
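The Coding Result table above is just one entry of this array rendered for the selected comment. A minimal sketch of how such a raw response could be parsed and validated before display (the function name and the allowed-value sets are illustrative assumptions inferred from the examples above, not the app's actual code or an exhaustive schema):

```python
import json

# Allowed values per coding dimension — assumed from the sample output
# above, not a complete codebook.
ALLOWED = {
    "responsibility": {"distributed", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "regulate", "none", "ban", "industry_self", "liability"},
    "emotion": {"mixed", "outrage", "approval", "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of code objects) into a
    lookup table keyed by comment ID, rejecting malformed entries."""
    codes = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid or not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {cid!r}")
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {entry.get(dim)!r}")
        codes[cid] = {dim: entry[dim] for dim in ALLOWED}
    return codes

raw = ('[{"id":"ytc_Ugy0iRDurE5kbF4u47t4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"mixed"}]')
codes = parse_raw_response(raw)
print(codes["ytc_Ugy0iRDurE5kbF4u47t4AaABAg"]["policy"])  # unclear
```

Validating against a fixed value set like this catches the common failure mode where the model invents a label outside the codebook, so bad codings fail loudly instead of silently entering the results table.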