Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews):

- `ytc_UgxddOo-H…`: "Can I use this AI software to give myself a six-pack and use it on my tinder pro…"
- `ytc_Ugz7dr3WE…`: "Well, he solved one of the problems with generative AI. But the main issue is th…"
- `ytc_Ugy8KGIth…`: "One magnetic pulse could stymie AI takeover of humanity. Keep an EMP in room wit…"
- `ytc_UgxlcLe3e…`: "If this thing can't detect a motorcycle what is going to happen when these robot…"
- `ytc_UgyuvUDK4…`: "i always thought to myself whether AI is the antichrist or not / whether you beli…"
- `ytr_UgxcIplLB…`: "I believe that, because all these claims don't match my experience with AI at al…"
- `ytc_Ugz_0oRji…`: "I'm sure they wouldn't lie and actually have someone sitting in a little booth t…"
- `ytc_Ugxe5Mo4g…`: "Anyone who thinks that AI will make all/vast majority of jobs redundant, fails t…"
Comment

Rather than questioning the ethical issue of what is “right” or “wrong” in terms of fairness, instead we should focus on what works and what doesn’t. 2 elevators never crash into each other because they’re not designed to do so. I agree that cars should be able to communicate with each other so that they can act in unison and at least minimize accidents. The design of the truck shouldn’t allow for such faulty things to happen. It’s also unethical to have us driving manually when it’s safer to automate it, and that goes for jobs in general as well.

youtube · AI Harm Incident · 2020-12-21T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgxzrJFHtwQVGAhSoTR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxU5tCZM8HakEDvsFl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwMY4iDTn7n18gY6-14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugz4ux_V7SfOPVtCZ9V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxScl_CUE54ZUahNwt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgxSt_7nWVXXZuuXwvV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgwalSdv_7oC0WE64eZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugzo1Jhy8gfrjzTAU1Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugxdohbhxra2IeZqFE54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_Ugxx9whhaDWkfEJOjy14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"industry_self","emotion":"approval"}]
```
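A batch response in this format can be indexed by comment ID so that any coded comment's dimensions can be looked up directly. The sketch below is a minimal, hypothetical example of that lookup, not part of the tool itself; the two rows in `raw_response` are copied from the raw response above, and `index_codings` is an illustrative helper name.

```python
import json

# Two rows copied from the raw LLM response shown above. In practice,
# raw_response would be the full model output string.
raw_response = '''[
 {"id":"ytc_UgxzrJFHtwQVGAhSoTR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugxdohbhxra2IeZqFE54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a batch coding response and index each row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
row = codings["ytc_Ugxdohbhxra2IeZqFE54AaABAg"]
print(row["responsibility"], row["emotion"])  # prints: company approval
```

Indexing by ID mirrors the "Coding Result" view above: the row for `ytc_Ugxdohbhxra2IeZqFE54AaABAg` matches the table's company / consequentialist / industry_self / approval values.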