Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "here is a solution, don't give robots the ability to feel. humans have always us…" (`ytc_Ugh_Jnzba…`)
- "@MASKEDB 1. Stop mocking Bob Ross. 2. Please, you did throw your towel. You thre…" (`ytr_Ugz7xZLIq…`)
- "Great show as usual. I had a conversation with let's say, a top scientist to do …" (`ytc_Ugy1GaTg6…`)
- "So why are we listening to what a robot says about someone's health instead of s…" (`ytc_UgywnZ3nu…`)
- "In the near future, when we buy certain products, we may not be buying them for …" (`ytc_Ugx-GRC_O…`)
- "Your argument reveals dangerous naturalization of algorithmic discrimination—fra…" (`ytr_UgxdPJEGA…`)
- "Thanks for sharing your thoughts! While Sophia's answers may seem straightforwar…" (`ytr_UgwxO0VR0…`)
- "How is this a good thing? I prefer human art too, but all this hate? This drawin…" (`ytc_Ugy9FOcyx…`)
Comment
Just a gut thought -- but I'd actually prefer to see AI competitiveness than a one-world regulatory body. Centralized governance is horribly prone to the worst issues of humanity as it has no accountability with credible consequences. History is utterly replete with examples of that. Competition, however, has credible consequences baked right into it. It's not safer initially, but I think there's a different clamping that happens on the overall level of danger within competition than the inevitable corruption in a lack of competition.
youtube · AI Governance · 2025-06-20T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxyEKGeR_Srwr8DOe54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyGURwqNcg2QvGmdUp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxn1tMrB6RHnSTwJmh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxY2vkLD3FqvJLz5X14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzpVEE8lZxi3dOQ7J94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy3gQtmy_J15lBxtxF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxdmi_gcX2IVapqkEt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzxv2qN68jp7otRdxp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugy-CXUe9YqZq3TNacB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwJWHfmomNW-SuK2Zp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
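The raw response is a flat JSON array: one object per comment, keyed by comment ID, carrying the four coded dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be parsed and indexed by ID, skipping malformed entries (the `index_codings` helper is hypothetical, not part of the tool):

```python
import json

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects) and
    index it by comment ID, dropping entries missing any dimension."""
    indexed = {}
    for entry in json.loads(raw_json):
        if "id" in entry and all(dim in entry for dim in DIMENSIONS):
            indexed[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return indexed

# One entry from the response above, shortened for illustration.
raw = """[
  {"id": "ytc_Ugzxv2qN68jp7otRdxp4AaABAg",
   "responsibility": "government", "reasoning": "contractualist",
   "policy": "industry_self", "emotion": "fear"}
]"""

codings = index_codings(raw)
print(codings["ytc_Ugzxv2qN68jp7otRdxp4AaABAg"]["policy"])  # industry_self
```

Indexing by ID is what makes the "look up by comment ID" view cheap: once parsed, retrieving any comment's coding is a single dictionary access.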