Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- "(and for the concept art I actually prefer to use photography to create the piec…" (ytr_UgzZKWs6j…)
- "A chicken in every pot and a killer robot in every garage. Muh Second Amendment!…" (ytc_UgygOlWtP…)
- "exactly and it doesn't need 100% jobs automation or AGI to to render the current…" (ytr_UgxFwK544…)
- "The AI offing the cat over the 5 lobsters was single handedly one of the funnies…" (ytc_UgydcP4Mx…)
- ""The idea of an infinite library is stupid. For the same reason that AI writing …" (ytc_UgyqmbQze…)
- "Am I only one who wonder why we have to create sentient AI. We are playing with …" (ytc_UgyVNcUuU…)
- "Now that AI is gaining a foothold lets start the UBI like they promised. Every …" (ytc_UgwGGLFiq…)
- "It's better to have both teachers and AI.. because the both help for better edu…" (ytc_Ugwdpg664…)
Comment
If a driver does something stupid, he is liable to some degree. Those creating these automated drivers are having regulators write laws that relieve them of responsibility. Now who is that good for? Only those companies. I think if we asked the population if they're OK with that, they would say no. Imagine the widespread damage this automated driving can do, and imagine that there is nothing we can do about it. Even if they can show that they are better statistically than human drivers, when something bad happens, we can find a way to prevent those things retroactively but the company making them won't be held liable. This is one reason I am against automated driving. I have others, too.
Platform: youtube · Video: AI Jobs · Posted: 2025-05-28T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz1SkrUXAYuxiMR6-R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz8w-eLdtwhezzr-ZR4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzdL0Z6dtz00AO5ZDl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdFZ-U1n6rGzfoNNJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwkSj9QoTcV3dx_soB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzlGOhmrrgxIGwpIJx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugy7CC7Ap28M4OtwrbV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzpDPEGT6Ai8rE-5rJ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz4EDtNrg0yTj3sseB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxmMkcusqCb4pM_-lN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
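Before codes from a raw response like the one above are trusted downstream, the JSON has to be parsed and each row checked against the codebook. A minimal sketch in Python, assuming the four dimensions and only the category values visible in the response above (the real codebook may define additional categories, and the `parse_raw_response` helper is hypothetical):

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the actual codebook may permit more categories than are shown here.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue"},
    "policy": {"liability", "regulate", "ban", "industry_self", "none"},
    "emotion": {"outrage", "approval", "fear", "mixed", "indifference"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping rows
    whose values fall outside the allowed categories."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[row["id"]] = codes
    return coded

raw = ('[{"id":"ytc_Ugz4EDtNrg0yTj3sseB4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded["ytc_Ugz4EDtNrg0yTj3sseB4AaABAg"]["policy"])  # liability
```

Validating at parse time means a malformed or off-codebook row is dropped rather than silently coded into the results table.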