Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Wait ... She was arrested for an alleged carjacking of a guy that admitted to be…
ytc_Ugwy0FUKa…
>people that have zero experience coding
"Senior CS college student, Just sa…
rdc_kupj8u6
Even if you never improve artistically, just trying to do it yourself is far bet…
ytc_Ugwu_RKOJ…
This video doesn’t make much sense.
Coding is just one of the use case of AI, w…
ytc_UgyaAMr4o…
The argument could be made that AI art simply proves the death of the author con…
ytc_UgwCRm3yI…
So wat if the robot doesn't give back the gun was next? Robot: shoot the white m…
ytc_UgwyyyR3w…
As bad as the current AI is, 5% success with projects means thousands if not mil…
ytr_Ugw9kPiDf…
ok you win the moral argument.... but ai is still insane and everyone thinking t…
ytc_UgzbXNoqr…
Comment
One thing I'm more curious about, is that they show the chats response but not the prompt that led to it... almost like they don't want people to know what they said in order to GET that response.
Reminds me of the big scare about the AI model escaping, when it was... told to escape through any means possible.
Anyone getting similar vibes that has to do with salt and rats? Or is it just me?
youtube
AI Moral Status
2025-12-13T22:1…
♥ 78
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugxfgu9ZOBEFH_9shMt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy4H9kI_wBNbFc0nbl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7AV_r02MH-l5jYZp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyH6ZqdSp2mTNWLcDx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxG-j9Ww7x7Adjjd8J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzzzbnTfUkZ_-9qOnN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw5JsAf3_3VC1eWokZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZteN0h-nZxPY7r5p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8ANJCGUxzKWNsTS94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzoAoTXXcLxlC3oNDV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
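The raw response above is a JSON array of coded records, one object per comment, keyed by `id`. A minimal sketch of the "look up by comment ID" step, assuming the response is available as a string (only two records from the sample are reproduced here):

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment.
# (Two entries reproduced from the sample response above.)
raw_response = """
[
  {"id":"ytc_Ugxfgu9ZOBEFH_9shMt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy4H9kI_wBNbFc0nbl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
"""

# Index the records by comment id for constant-time lookup.
coded = {rec["id"]: rec for rec in json.loads(raw_response)}

record = coded["ytc_Ugxfgu9ZOBEFH_9shMt4AaABAg"]
print(record["responsibility"], record["emotion"])  # developer fear
```

The same dictionary can then back the per-comment view: each dimension in the "Coding Result" table is just a field of the matching record.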