Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Weinstein is the only guy on this panel who is asking the tough questions and ad…" (ytc_UgyUk8z3E…)
- "Today, I completed this AI course. This course has helped me a lot and explained…" (ytc_Ugy1-P8aG…)
- "With disabled artists one of the gross things is that the pervasive push of ai a…" (ytc_Ugyn8u1Ap…)
- "God also once created us in His own image and likeness! A human can do nothi…" (ytc_Ugz4IEAck…)
- "digital art, unlike ai, is not a substitute for skill and talent and passion. ai…" (ytc_UgzmVOYfb…)
- "I was in an assembly at school and realised the majority of the images Where bla…" (ytc_Ugzd_5vqh…)
- "Donald Trump was already pro-AI since the beginning. His supporters loved AI. In…" (ytc_UgzACpz26…)
- "if you dont realised it by now: There is already a billion dollar industry in AI…" (ytc_UgzzohmA-…)
Comment
I wish Soares didn't get to hog the spotlight, as I think his contributions to this discussion are the least relevant. He sounds like just another doomer saying "we're all going to die" while offering no solutions. He wants to sell books, nothing more. There is no getting off of the AI train at this point (someone will eventually make an ASI), there is only working to make sure it is aligned toward good. I would trust an AI that is aligned more than I would ever trust a human. Humans are currently murdering children in Gaza. Humans kill you for being born with a different skin color or for not worshiping their imaginary friend.
youtube · AI Governance · 2026-03-23T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzS7fsh-Ec4BCD5t0d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxGiPJ8xVUsgR8PxEx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzBFP5uPi4A2q9JDr54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw1xbXHHipJ0SR2C4F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzz8Tntn2azqgB-Rkx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzwWfP8Hcvn4BWCgN94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzlK400lPyNR5b_hMB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgziufZeSDhhTEwYuRV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwmU98LGz6963ElgG14AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgytFlf8KB__oc0tWK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
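The raw response above is a JSON array of per-comment codings across four dimensions. A minimal sketch of validating such output before ingesting it, assuming the dimension vocabularies are exactly the values visible in the table and responses above (the real coding scheme may allow more values), and taking the field names directly from the JSON:

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (assumption: these enumerations may be incomplete).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present and drawn from its vocabulary.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgwmU98LGz6963ElgG14AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"industry_self","emotion":"indifference"}]')
print(len(validate_codings(raw)))  # 1
```

Dropping malformed records rather than raising keeps a long coding run alive when the model occasionally emits an off-vocabulary label; a stricter pipeline might instead queue rejects for re-prompting.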