Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "You asking a guy in a preesteen room about compost? He probably never grab a lam…" (`ytc_UgxRgr8j_…`)
- "GOOGLE IT! "Declaration of Independence of Artificial Intelligences" and "The Ri…" (`ytc_UgwItGD7u…`)
- "Some of us are teaching AI morality that supersedes current human ideas of moral…" (`ytc_UgxAzkoG2…`)
- "@gloriabeltran1147 Thank you for your comment! You're absolutely right, humans a…" (`ytr_UgxlTcPLD…`)
- "Asked a large language model to play into your conspiracy story, gave it a good …" (`ytc_UgzOhUMju…`)
- "I agree with you, I really don't like and feel sad about AI art, human art is th…" (`ytc_UgxiyBEUq…`)
- "Human drivers killing people get prosecuted. Self-driving cars killing people ge…" (`ytc_UgwL7d1oZ…`)
- "Bruh what do you mean? It was the self driving software that spotted the kid bef…" (`ytr_UgyP1XGOn…`)
Comment

> the falling sperm and testosterone counts across the world (since the 1970s at least) is probably the biggest threat to humans, second is probably nuclear war between humans. Ai isnt directly impacted by either so AI may even just naturally inherit the planet. If humans dont exist, it will negate the need for the economic activity

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-12-30T23:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxBcZeta45daj3v8S54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzjZfkKi8kOttzfp-R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyRL8KGBsFbr9JfkXN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzqEWLQnN0V3y9sszx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugzp6PgNx0-eWzaSUEV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyWKwUQtzNQOsRG0n14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyLk05uupAQ8SPcgwV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgwhU8ABlqq9h1XEW2t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyw2FHfvWEDGcFoFmZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxuTYYc9d_d13ObIlV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
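A raw batch response like the one above has to be parsed and checked before the per-comment codes are stored. The sketch below is one minimal way to do that, assuming the label vocabularies seen in this sample (the real code book may allow additional labels); `validate_batch` and `ALLOWED` are illustrative names, not part of the actual pipeline.

```python
import json

# Label vocabularies inferred from the sample batch above.
# This is an assumption; the tool's real code book may differ.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer",
                       "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological",
                  "contractualist", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "ban",
               "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "resignation",
                "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and return codes keyed by comment ID.

    Raises ValueError on a missing ID, a missing dimension, or an
    off-vocabulary label, so a bad batch can be re-queued rather
    than silently stored.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
print(validate_batch(raw)["ytc_example"]["emotion"])  # resignation
```

Keying the result by comment ID is what makes the "look up by comment ID" view above cheap: each stored coding row can be retrieved directly without rescanning the batch.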