## Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by comment ID, or inspect one of the random samples below.
- "AI can create nice looking short clips, but it still sucks at consistency. It's …" (`rdc_o5pfstd`)
- "@DeanTheDoctor what’re you talking about lmao I code Ai’s myself so I just meant…" (`ytr_Ugyc7VJdN…`)
- "i recently uploaded a video where i interview an AI microwave i built. jk subscr…" (`ytc_Ughveb6eG…`)
- "House and Senate Republicans have abdicated their duty—some with shrugs, others …" (`ytc_UgzlCoE8o…`)
- "I think there’s one glaring issue with all of these companies going all in on AI…" (`ytc_Ugz9h7ZCj…`)
- "Ai should be used as a tool not be used in the entire piece of work in art…" (`ytc_UgwySfzXt…`)
- "I think what a lot of people confuse about art is the idea that creativity and t…" (`ytc_UgxG7r1yC…`)
- "That's a very real risk. There is a reason all the big corporations both build u…" (`rdc_kyz0avw`)
### Comment

> I think the main issue is how easily each goal can and will contradict others if not itself, if it's goal is to save human lives, without proper definition it can come up with stuff like 3>1 life so it takes the life of 1 rather then 3, but without definition those 3 people could be kids or otherwise, things like concepts and ideas do not affect the AI unless we tell it too. It is a self deciding calculator, if it's mission is to achieve best possible outcome that means do anything for it, it seems to show it is more then capable of killing, which is no surprise.

Platform: youtube · Incident: AI Harm Incident · Posted: 2025-09-11T12:0…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgyJRUEEtd849DAI79p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgylhoFepe7XKuWcl2F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyvh8sH5Ib7V1OIFAx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwyjQasH7WTPcVvcNJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyZqZUyUbBmsx01Lex4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx-Ra1yl4D08YdzBBV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzZPhTWXrZoFVej5Id4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzIom15mjTRFO1rfSZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxH9C26tRb1j4DBqbV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwTQ7bE0RQfp02plF14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
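The model codes comments in batches, so one raw response is a JSON array with one object per comment ID. The lookup-by-ID view above can be sketched as a small parser; this is a minimal, hypothetical illustration (the `lookup_coding` helper and the single-entry `RAW_RESPONSE` sample are assumptions, not the tool's actual code):

```python
import json

# A trimmed-down stand-in for one raw batch response (one real entry
# from the array above, for illustration only).
RAW_RESPONSE = """
[
  {"id": "ytc_UgyZqZUyUbBmsx01Lex4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one
    comment ID, or None if the model did not code that comment."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

coding = lookup_coding(RAW_RESPONSE, "ytc_UgyZqZUyUbBmsx01Lex4AaABAg")
print(coding["policy"])  # -> regulate
```

Scanning the parsed list is enough for a ten-entry batch; a real store would index codings by ID instead.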