Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- The one thing I don't like about this is that this is all from the perspective o… (ytc_UgyCludCQ…)
- Thanks for the information. The TOS and copyright issue are 2 important factor d… (ytc_Ugz0In2x_…)
- Asking ceos about how things are valued is the problem - they only master how to… (ytc_Ugw_qlhC_…)
- They are currently building the AI infrastructure for the Antichrist and the abo… (ytc_Ugxi_oP9t…)
- "An AI cannot be held responsible. Therefore, it must never make a management de… (ytc_UgwuSKQu3…)
- i gotta say, when they, idk, people, the media, say big tech, or hey instagram, … (ytc_Ugwctbies…)
- The guy trying to market and sell AI art is disgusting. But setting that aside, … (ytc_UgyZaiFlW…)
- This is why i stopped trying to force my mind into doing game development and de… (ytc_Ugzur5FwC…)
Comment

> Wait, forget AI being a threat to humanity. The DNC was giving away free abortions and vasectomies at their rallies and pushing puberty blockers and you are saying Musk has no moral compass? OK there buddy. Musk warns about AI and protects freedom of speech, but come on about ending humanity. That’s the job of liberals of one side and Islam on the other. All other people with the drive to have families and relationships that have dynamics outside of Islam or transgender identity will be squeezed into the middle and eliminated . But why call the man a problem because he made more contributions to everyone than most will ever do. Sorry your butt hurt about that.

youtube · AI Governance · 2025-08-11T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwDfox2ehZr4UMdU2B4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwvqkT6eZB9YZLjIwx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy7mjx5iPk3BHRdgvZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzxkG4mMwtIoTEeo6l4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugxlh4444vyymgCTck54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwonBo1bcGmvlmjV914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwpGJbPzOPkTMpKaYt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugyb4jfb7l6RbK5RZ8F4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwNlawkUR_Ga_TUGkh4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw9qli2xamRyHmUNAx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"}
]
```
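A response like the one above is only usable if every row sticks to the codebook's vocabulary. The sketch below shows one way to parse a raw response and drop rows with out-of-vocabulary codes. Note the allowed values are an assumption inferred from the codes visible on this page; the real codebook may define additional categories.

```python
import json

# Allowed codes per dimension, inferred from the rows shown on this page.
# Assumption: the actual codebook may contain categories not seen here.
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "government", "developer", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference", "unclear"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are in vocabulary."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Reject rows missing a dimension or using an unknown code.
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rejected rows would typically be queued for re-coding rather than silently dropped; returning them separately is a straightforward extension.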