# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a response by comment ID.
## Random samples

- `ytr_UgwCeVRUI…`: @jensenraylight8011 Have you even used AI to make stuff? I feel like there's a l…
- `ytc_UgyElrOhR…`: ai can do speed lapses as well unfortunately. Humans are messy, ai is consist…
- `ytr_UgyN49kBq…`: @ImWoolly It does. How will you teach a robot to design, build, navigate, expl…
- `ytc_UgwVESYja…`: And this is why openai is losing billions alongside microsoft and amazon , or sh…
- `ytr_UgwsKCcIj…`: @Andrew-ty6ll Lol, I can tell this is an AI-generated response just by the wordi…
- `ytc_UgwLqQf8z…`: The Chicago PD Ai predictive study lead to over 200% accuracy over human police …
- `ytc_UgxrNFGPf…`: this discussion makes me insane since its predicated on the assumption that we h…
- `ytc_UgycgGLWM…`: Okay but...we all know this. Just as we know there's no way that all humans on e…
## Comment

> Clearly self driving cars are not ready for prime time. FYI; The LIDAR system used in this car doesn't need day light or light of any kind, so the illegal dark street crossing fatality was totally because the car didn't react properly, or in the case of the UBER crash, it didn't react at all (watch the video). For now I'd suggest that self driving cars should only be used in certain areas, like freeways, if at all.

Source: youtube · 2018-03-23T01:4… · ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugx_cYfqNU3aDhdPlt14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwHuKYnGxVVo6rFHhx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy9rO2wNOT-ikBr4B14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzojAAVa671GkDhr9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyKQyZyilGfh0x72Gt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzgX2RvjccjA-YPstx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxJ9c89mNiilIm2Ybp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxEzCpiwo6VeTlj5jd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy63pl_ZifMbW-O96h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzyGRU8kjp5e7zZ_SJ4AaABAg","responsibility":"company","reasoning":"regulate","policy":"regulate","emotion":"indifference"}
]
```
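The lookup-by-comment-ID step can be sketched as follows: parse the raw LLM response as JSON and index the rows by their `id` field. This is a minimal sketch, not the dashboard's actual implementation; the IDs and code values are taken from the response shown above, and the `lookup` helper name is illustrative.

```python
import json

# Two rows excerpted verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugy63pl_ZifMbW-O96h4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzyGRU8kjp5e7zZ_SJ4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "indifference"}
]
"""

# Index the coded dimensions by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the four coded dimensions for one comment ID (illustrative helper)."""
    return codes_by_id[comment_id]

print(lookup("ytc_Ugy63pl_ZifMbW-O96h4AaABAg")["emotion"])  # fear
```

Because the model returns one JSON object per comment in a batch, the same index also supports cross-checking that every requested ID actually appears in the response.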