Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm excited to see the new species that will evolve to fill the niche leftover!…" — `rdc_degftlu`
- "Gee, I wonder if AI actors will also be preaching to us on how to vote and be su…" — `ytc_UgyS9isJl…`
- "This actually breaks my heart. Creation is the kid important things about humans…" — `ytc_Ugww8Pyrh…`
- "@vishnu-t7b3j Most people don't realise or believe that AI will be the cause of…" — `ytr_UgyTdoW38…`
- "This is only the beginning — Just as long as Artificial Intelligence has no regu…" — `ytc_UgxsT8WFx…`
- "I'd love to know what prompts were used for the crazy girlfriend, would be cool …" — `ytc_UgxyML_c4…`
- "But don’t all artists learn from art in different ways? If so, isn’t AI learning…" — `ytr_UgxR4_bpv…`
- "Thank you for sharing your interpretation, but in the context of this video, Sop…" — `ytr_UgyA6Vw9i…`
Comment

> I stump Google AI today absolutely no context on the matter. it a first for me. usually it tries to convince me that I'm wrong. this is a human contidition not AI. here you go "AI will not be safe to human as long there is one evil programer or hacker, just think of aIl Terrorism or a whole country of terrorists that rape, beat, and kill there own people like the middle east. we can not stop terrorist attacks completely . they happen all over the world all, we cant stop them unless we stop there indoctrination."

Source: youtube · Topic: AI Moral Status · Posted: 2025-08-26T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzpJM16cXJj7RdyXZF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz-ibpMyHjVvQQO3q94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyYMElQFK6FE36adpF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyJUJBKo0y7BTUDMSp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5RyHpWlofN_hPC5F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgztGHFa-aWRqMLBHox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxbm1908YYWU7gAFAp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwKA_eEdDGVVN0c9fZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw8IKq9MY13R9kK3hF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyWeeS7pNNRc7Jvq6N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
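The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four coding dimensions. A minimal sketch of how such a batch response might be parsed and indexed for the "look up by comment ID" view is below. The allowed category sets are inferred from the values visible on this page and are an assumption, not the tool's actual schema:

```python
import json

# Allowed values per dimension, inferred from the codes shown on this page
# (assumption -- the real coding schema may define more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"liability", "regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a batch coding response and index it by comment ID,
    rejecting any value outside the allowed category sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record payload for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["policy"])  # liability
```

Validating against a closed category set at parse time catches the common failure mode of LLM coders inventing labels outside the codebook, before the values reach the results table.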