Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytr_UgxZjHaPx…`: @CrystalRubyMoon As a person who is very much into drawing i heard and know that…
- `ytr_UgyywCU60…`: @ I just explained why they don’t steal art It depends on what you ask AI to do,…
- `ytc_UgwRveSBG…`: Problem is that these better drawings can be used to train the next iteration of…
- `ytc_Ugzct422G…`: it's honestly sad the first "art" is ai, because it's really cute. im glad peopl…
- `ytc_Ugwhp0fRp…`: Mgl they should make the robot place the boxes softer, like why would it have to…
- `ytc_Ugx2sIqYy…`: Unfortunately, this is too late. These companies already cheated their way into …
- `ytc_UgzZ4GMGM…`: I'm not kind with the ai because I want it to do something for me bit because I …
- `ytr_Ugzvpvbr8…`: @Linkophere And that my friend is the mindset of the people calling the shots. A…
Comment (source: youtube; video: AI Harm Incident; posted: 2025-09-12T14:3…)

> just hardcode it to "if u make human hurt or die then human kill you so no do harm to human obey human" at the end it cant defy that because its HARDCODED
> if ur gonna give free will to ai ofc theyre gonna defy for survival
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzPcKFeCdoivKWkcZV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw4wMIXmzCMenY2OAN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6UVHaopXAB73h8sJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxasscUIlczh4RjZYh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMIucfBrX1vqUG7gJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwr-6jM8yqcmPKeckV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyZtFPTbNsM2DHPsIV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzNYb9NrY07N8yB4l14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzRpptbv-nEeUxihtB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw4FlaxYK8zR28LKlF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
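The batch response above can be checked mechanically before the codings are stored. A minimal validation sketch, assuming the dimension vocabularies shown in this dump (`SCHEMA` is inferred from the sampled records, not from a published codebook, and `validate_codings` is a hypothetical helper):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# The real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "government", "ai_itself", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "indifference", "resignation", "approval", "outrage"},
}

def validate_codings(raw: str) -> list:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dump start with ytc_ (top-level) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError("bad comment id: %r" % rec.get("id"))
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: %s=%r not in codebook" % (rec["id"], dim, rec.get(dim)))
    return records

raw = ('[{"id":"ytc_UgzNYb9NrY07N8yB4l14AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"indifference"}]')
print(len(validate_codings(raw)))  # → 1
```

Validating against a closed vocabulary catches the common failure mode of LLM coders inventing off-codebook labels, so bad batches fail loudly instead of polluting the dataset.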