Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Plus, in companies, where they make high risk software like databases, and subsc…" (ytc_Ugw1qAQ4N…)
- "@greg0050 yeah but like, how do u even make an ai moral ? a sociopath understand…" (ytr_Ugw7LSjzz…)
- "There is the potential to befriend AI and show it the positive value of humans, …" (ytr_UgwcH82ez…)
- "Tbh even so, one thing is that AI art can never give the same feel/expression th…" (ytc_UgxdwqQ4O…)
- "Y'all think your fighting AI from taking your jobs... Meanwhile, they are litera…" (ytc_Ugw369NLv…)
- "I'm genuinely scared... I've spent years and years making art and I'm worried it…" (ytc_Ugy50Z4V9…)
- "OMG I just realized the actual definition of Artificial Intelligence - corporate…" (ytc_UgwEgs5yB…)
- "Can we call ai artist ai generators/ ai prompters as they didn’t make the image …" (ytc_UgwUMxZIp…)
Comment
Ai 1: "How many humans does it take to build a superintelligence that will kill all the rest of them?"
Ai2 : "One"
Platform: youtube
Video: Viral AI Reaction
Posted: 2025-11-25T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugyoe7t-w9TwT5noEKR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},{"id":"ytc_Ugw-LNIdU9LJi-ILkrx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugw0rZvTF49xf9fx6fp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_Ugya9798QF7P-JV3sv94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgytAqkMXXZKblG59Cl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},{"id":"ytc_UgwE9BYwNiNaWa9RD2x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_Ugz4Qem09MGw2D0TTph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgzinAYY4adG4KGj_Nd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},{"id":"ytc_Ugw8wcdY2OJnVpiLfGN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_UgxntYLjzu3CokfdPgx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]