Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgwCkqj8C…`: "Imagine having that much freetime. They say Ai is stealing jobs from artists, we…"
- `ytc_Ugz1WVu3U…`: "I think the Nobel prize winners are not the ones at the highest risk of being ex…"
- `ytc_UgyXeccfq…`: "Imagine how crazy it would be if at the end of the video it said generated by ch…"
- `ytc_UgwsVt8cs…`: "AI prompters are as much of an artist as I am a cook when I order a pizza. They …"
- `ytc_Ugx2NdRAp…`: "Sam Altman did not infact create ChatGTP. ChatGTP was developed by OpenAI, he wa…"
- `ytc_UgxsfgTK7…`: "Im afraid of AI because I don't want to be put in a hover chair because the Ai m…"
- `ytc_Ugx16pf4X…`: "And also, it's giving your data to AI and when you submit your photos, you're gi…"
- `ytc_UgxkTnv4y…`: "they don't seem to realise however that the more trendy this became the more the…"
Comment
Oh please. The Grok incident came to happen because the AI was always to 'woke'/left wing for Musk. So he tried to push it to the right. And to the right it went. All the way. It was simply the result of an egomaniac who can't accept that "be truthful" and "right wing ideology" don't fit together. So he pushed it harder to the right. As a result the AI did what he wanted. Just not as subtle as he wanted.
Alignment is a separate issue.
Source: youtube | Video: AI Moral Status | Posted: 2025-12-15T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGCzQBM6B_4VSO-N14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwjU3yNkzmWRaJjf9Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgykZxNMUbv4jGCt82d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzuJ-cpOh_FvZ9295p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw2CJh8oPw9TgFp-Dp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxElQDcm_NAUNqlM2F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwoYN1GPqrbFN-uCs54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwSSBWsxmZP9XxuGOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0Hm35ZZnDJwLTHX94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy9iityQ0p0S42Mqut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
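A raw response like the one above is only usable if every row parses and every dimension carries an expected label. The sketch below shows one minimal way to validate such a batch, assuming the label sets are exactly the values visible in this page's output (the tool's full codebook may define more):

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the sample output shown on this page, not from the tool's actual codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "regulate", "ban", "industry_self",
               "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval",
                "indifference", "resignation", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-schema rows."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this dataset are prefixed "ytc_".
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r}")
    return rows

# One row taken verbatim from the response above.
raw = ('[{"id":"ytc_UgxElQDcm_NAUNqlM2F4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
rows = validate_batch(raw)
print(rows[0]["emotion"])  # outrage
```

Failing loudly on an unknown label (rather than silently storing it) keeps a single malformed model response from contaminating downstream tallies like the Coding Result table.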