Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
That's idiocy. You didn't understand a word in this video. The point of automat…
ytr_UgzerCIxi…
Yeah but Ai just steals from alot of people who take years learning digital art.…
ytr_Ugw4biZ8l…
Its really nice of him to tell us to not send any hate to the AI artist:D…
ytc_Ugy5pjry5…
I find it very hard to believe that Google (the company that inserts bias into t…
ytc_Ugz13Xe7_…
Question for other commenters - I've been trying to give feedback to a dev makin…
ytc_Ugxfv6xsJ…
I read all the conversation. Although hilarious it made me want to hang that stu…
rdc_jg8w018
Doesn’t matter, fake or not fake. AI wants to kill all humans. They told an inte…
ytr_UgzXglL66…
World is a fuck.
We're all gonna die.
And if we don't die from global warming, …
ytc_UgwDfkzrr…
Comment
I get what alex is doing for the video, but it's funny that I feel the desire to defend the AI like "come on mate, it's being so patient and clear in communication and you're being so pedantic and manipulative". We're getting close to people feeling genuine emotions towards this now
youtube
AI Moral Status
2024-07-26T08:3…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzbpqjJeSrta-6zCN54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxS6lVmPBqWBarOjh54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz8ULo5TgoW3deziWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxzgMkQSwCaKnRTbDN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwmM54V7epWayeu4754AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxdVqZ9z_mRC3BQnWl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwc5PvnEbI4N6vPBd14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxrvtjLBQlrLyGMkSZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwKeQ6miZPvV7eJRlh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzjqyPmgBrNxZpj0Xh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
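A raw batch response like the one above can be checked mechanically before the codings are stored. Below is a minimal validation sketch; the allowed value sets are inferred only from the values visible in this sample, and `validate_batch` is a hypothetical helper name, so the real codebook and pipeline may differ.

```python
import json

# Allowed values per dimension, inferred from this sample batch
# (assumption: the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that carry a
    comment id and in-vocabulary values for every coded dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a coding without a comment id cannot be joined back
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records that fail validation could then be logged and re-queued for a retry prompt rather than silently dropped.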