Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Thank you so much for talking about this, Charlie. As an artist myself, the risi…" (ytc_UgxCcY5s-…)
- "Yeah and Trump just signed an executive order that prohibits States from resisti…" (ytc_UgyFzV4Ya…)
- "When we write or post something on social media we can't expect privacy or no da…" (ytc_UgzTuX5mv…)
- "Bruh what do you mean? It was the self driving software that spotted the kid bef…" (ytr_UgyP1XGOn…)
- "I wish to create homebrew rpg books and game adventurers so I really need AI art…" (ytc_UgxQQwZV5…)
- "Realistically I can see what’s happening here. Robots with legs have to balance,…" (ytc_UgxE1MXjm…)
- "I've not really seen good data on what physically constitutes consciousness, it …" (ytc_Ugzkoh2Rt…)
- "5:20 Give me a break. Ford knew it was murdering people in the 1970s with their …" (ytc_UgwaPyaoF…)
Comment
With all due respect, something crucial wasn’t pointed at. Saying machines are going to make mistakes by encouraging to bomb a military target what happens to be a civilian shelter, and then saying guess what humans are making mistakes too and exactly here we found the lie! Soldiers are targeting civilians on purpose. Look what’s happening in Gaza.
So AI is going to be ethically and morally like its programmers… so it will target innocent children and adults! It’s not an existing question anymore.
youtube
2025-05-23T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugwqq1r0r8UC9hEkbbF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyy2T5xlX1lgZNkilR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyxL2usqDq29Ujr2GV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyUQ4QIbQt9uRKIdT94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzaxRsw_S-k7BQm4H94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugz6LJWZ63lpynZFeOx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyqVdEyPFQOQ5mCTth4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwyuu2C6W-MBu3Uly14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwoR4Jf1UfAlE7tmAp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw7YEGaHplslnQHD7Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
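A raw response like the one above is a JSON array, one object per coded comment, with four coding dimensions. The sketch below parses such a response and flags any code that falls outside the expected vocabulary. The allowed value sets are inferred only from the values visible on this page and are almost certainly incomplete; treat them as placeholders for the real codebook.

```python
import json

# Vocabulary inferred from the sample responses shown above — an assumption,
# not the authoritative codebook for this tool.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "government", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "resignation", "indifference"},
}

def validate(rows):
    """Return (comment_id, dimension, bad_value) for every out-of-vocabulary code."""
    errors = []
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                errors.append((row.get("id"), dim, row.get(dim)))
    return errors

# A one-element response in the same shape as the raw output above.
raw = '[{"id":"ytc_Ugwqq1r0r8UC9hEkbbF4AaABAg","responsibility":"ai_itself",' \
      '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]'
rows = json.loads(raw)
print(validate(rows))  # → [] when every code is in vocabulary
```

Running this check before writing codes into the table above catches malformed LLM output (misspelled labels, missing keys) instead of silently storing it.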