Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples
I generated a meditation using my script and an AI voice-over, but it didn’t wor…
ytc_UgyywK_pN…
I, for one, am ready for Fully Automated Luxury Communism. AI can take our jobs …
ytc_UgwrCweGT…
Regarding UBI: I agree that we need to find a solution that helps everyone stay …
ytc_UgwOgA27o…
I'm going to answer that question for you as someone who works with AI. No it's …
ytc_Ugxs83LJ1…
if a human artist photobashes images they dont have the rights to together and u…
ytc_Ugwp-eyRS…
This argument literally breaks my heart. I spent 3 days on a digital piece and i…
ytc_UgyyaKbZn…
This is the interview they play at the beginning of a A.I horror film. Nightmare…
ytc_Ugx3loZF4…
And what you wrote is one of the greatest things that give me joy. AI bros getti…
ytc_Ugy2QsftY…
Comment
Yes indeed... These technologies COULD be developed in such a way as to prevent so many of the side effect catastrophies, as she mentioned... but they WON'T be developed or implemented that way as long as there's a significant profit to be made from just carelessly using the tech... As long as money and greed is involved in the overall picture, then you can almost guarantee that A.I. tech WON'T be utilized in the best and most humane ways! (Sadly, and rather scary too!)
youtube
2026-04-09T18:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwO-csWxCF1-sB0TIl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxMLx1t7e2lI8sS4JF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyH3nAwyo-40hBCOcZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxrxw7Jl5wsWULP6SZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwNrkyGxGN4ujvYQAp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWQ7UORr9MCvyTBth4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzrqM0-eP7H3lxOXW94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwQFiKPI3JEs7kzoHl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwO9ctVMwzeG5d7nCl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBH089GROAz4AP2Jt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
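The raw response above is a JSON array, one record per comment, with one value for each coding dimension. A minimal sketch of how such a response might be parsed, validated, and indexed by comment ID — note the allowed values per dimension are inferred only from the samples shown here; the actual codebook may define more categories:

```python
import json

# Abbreviated raw LLM response: two records copied from the output above.
raw = '''
[
 {"id":"ytc_UgwO-csWxCF1-sB0TIl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxMLx1t7e2lI8sS4JF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

# Allowed values per dimension, inferred from the visible samples
# (assumption -- the real codebook may differ).
ALLOWED = {
    "responsibility": {"none", "government", "company", "user", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"resignation", "fear", "outrage", "approval", "mixed"},
}

def validate(records):
    """Index records by comment ID, rejecting any out-of-vocabulary code."""
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
        by_id[rec["id"]] = rec
    return by_id

coded = validate(json.loads(raw))
print(coded["ytc_UgwO-csWxCF1-sB0TIl4AaABAg"]["emotion"])  # resignation
```

Validating against a fixed vocabulary at ingest time catches the common failure mode where the model invents a label outside the codebook, before the record reaches the database.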