Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- @hello-qq7fh You clearly misread the comment. First, they said "seems", not "is.… (`ytr_Ugzr_UUyb…`)
- I think we need to expose all the AI art evangelists the this book called Choke … (`ytc_UgxfzS238…`)
- I think you vompletely missed the point of what asmongold said. He was comparing… (`ytc_Ugy3FFoZ7…`)
- Ai uses the internet to get it's answers. I guess the ai searched up the questio… (`ytc_UgyWnmR4K…`)
- Hey @ryuk5673, thanks for commenting! That robot sure knows how to throw a punch… (`ytr_UgwY8Wcj1…`)
- Just create a few more "non-profits" for the data centers and then a few more "n… (`rdc_lp6yfpx`)
- Yall should have kept me around to help with your AI systems. Instead you wanted… (`ytc_UgzwLo4bI…`)
- Couldnt come from a less trustworthy source.. If only AI werent run by sociopath… (`ytc_UgyuawvF8…`)
Comment
I welcome this dialog. The guest's statement "...if you give a small group of those people too much power to develop technology that will affect billions of people's lives inevitably that is structurally unsound" is worth exploring and debating. We have lots of examples through out history where that happened and created good for all. The statement alone can be misleading and perhaps could be rephrased. In reality, small groups of people (sometimes one single individual created technologies that affected the entire world positively (i.e.: Nikola Tesla). Having a legal environment that ensures responsibility, accountability and risk management in fair ways (not only favoring elites) and a government that enforces those laws, to me are the key factors for success and desired outcomes. We lack a lot of that these days especially.
Platform: youtube · Cross-Cultural · 2025-06-30T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxtk-JTu2kAKWHC-i54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzCF8cF5yAbQ25nsIF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwCStaebKZdbB0ofPd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxeH06fXhVbxJ8IPpV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxAvwUBhimNMrJ4oXh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzWJHQWdJEteP0BJJN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugxq_F_ZVtLlMf3kmPp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyRGiUwDlf0V-4NHV14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwMzohT1ndPawY1tyl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxlFY596wJR_Z9diwV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
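A response like the one above has to be parsed and checked before its codes can be trusted. The sketch below shows one minimal way to do that, assuming the allowed values per dimension are exactly those observed in this batch (the full codebook may define more); `parse_coding_response` and `ALLOWED` are illustrative names, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension, as observed in this ten-record batch.
# Assumption: the real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "government", "user", "distributed"},
    "reasoning": {"mixed", "deontological", "consequentialist", "virtue", "contractualist"},
    "policy": {"unclear", "regulate", "none", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)  # raises json.JSONDecodeError on invalid JSON
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# One record taken verbatim from the response above.
raw = ('[{"id":"ytc_UgxlFY596wJR_Z9diwV4AaABAg","responsibility":"distributed",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"approval"}]')
records = parse_coding_response(raw)
print(len(records), records[0]["policy"])  # 1 regulate
```

Rejecting out-of-vocabulary values outright, rather than silently keeping them, makes it easy to spot when the model drifts from the coding scheme and to re-prompt just the failed batch.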