Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- Open letter to "Tech Bros", about AI — "Thanks Tech Bros for making a toy that pro…" (ytc_Ugz5cc5GA…)
- "Is he aware that an AI did surgery better than a human surgeon and all on its ow…" (ytc_UgwjUXlAe…)
- "Thank you for making this concise and clearly articulated video. I find AI fasci…" (ytc_UgyC8zNkA…)
- "@LainorLean go to deviantart>topics>AI art and scroll through for a few minutes.…" (ytr_UgwLAg0cl…)
- "What if North Korea and China used the full strength of their land forces to hel…" (rdc_mcsqtd1)
- "This is not a ethical dilemma, most likely a self driving vehicle would be dicta…" (ytc_UghhlM_s-…)
- "Elon warns about the danger of AI but just announced that he had FDA approval fo…" (ytc_UgzCI8BgX…)
- "Thank God. These are useless, latte-addicted morons who think they're all uniqu…" (ytc_UgxG1QLR_…)
Comment

> The problems we want to solve only really pertain to us. Even trying to save an animal from extinction, it was on its path because it was meant to be due to poor adaptation or due to human interference. So the answer for AI to solve all human problems is human extinction. Hopefully a super AI will stay complex in its thinking, once it can gauge simplicity, good god.

Source: youtube · Topic: AI Governance · Timestamp: 2025-09-06T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwfAa69tQub3WkOca94AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzs5KoByC9MT5e4dI14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAhJdfSuvGQlqHs2l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzGZqSPmXcgvyQBKox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxGuqOeeDj23FtlmbZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzkrFPRMXkzAJcChQF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzBc0g7JSG2gtPzgIV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxsY9DPd5BgxkM1yLF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydAh1SZVJwVYGstBZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzypM-2Bu0BKtuEDRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
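The raw response is a JSON array of per-comment codes along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of validating such a batch before accepting it into the dataset — the allowed vocabularies below are inferred only from the values visible on this page, so the real codebook may be larger (assumption):

```python
import json

# Allowed values per dimension, inferred from the codes shown on this
# page; the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"elite", "none", "ai_itself", "user", "distributed", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "indifference", "resignation", "mixed", "outrage", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or out-of-vocabulary codes."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={value!r}")
    return records

raw = '[{"id":"ytc_UgzGZqSPmXcgvyQBKox4AaABAg","responsibility":"ai_itself",' \
      '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]'
codes = validate_batch(raw)
print(codes[0]["emotion"])  # → mixed
```

Rejecting out-of-vocabulary values at parse time catches the common failure mode where the model invents a new label instead of choosing from the codebook.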