Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID. A few random samples:
- "why is it so hard to teach people programming skills? it's like people are doing…" (ytc_Ugz1hDUkO…)
- "In my opinion, AI art/writing should only be used for fun or for reference/inspi…" (ytc_UgwInMXvo…)
- "Are we destroyed by AI?? Let’s share comments with me. To people all over the wo…" (ytc_UgzqaaYxX…)
- "Not jumping straight into their traditional school work but instead starting the…" (ytc_Ugw3_yTJ0…)
- "I would have gotten in the front seat step on break lock the p break and put the…" (ytc_UgwWngqlb…)
- "Bro the US government can't get PirateBay down for more than a few minutes 😂 the…" (ytc_Ugw5MjHsA…)
- "all men think power money for AI powerful men dont see the danger behind AI have…" (ytc_Ugy6zR9RM…)
- "That would require a moral sense in ethics, Which many dont have today. Besides …" (ytr_Ugxz25xkh…)
Comment

> I don’t think it will kill everyone - it will keep a small amount of people in sanctuaries for 2 reasons, some humans will be able to do things that AI can’t and the AI will want to study it or they will have a sense of beauty like we do with wild animals and nature. The problem with the sense of beauty is it evolved with humans so I doubt it would be in AI.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2026-02-01T14:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgziQvlqc2yTA8IAROR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwK65UEaebBtX6HK1N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxYktEcmkAKWoNgkJN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwsYKxS4n7TO1ExGxJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzslie877F3k5anq6x4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyl3EuC1ndYutyvC8B4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugylc20EAl8lJ_u2MPx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwoYLR8i58a34cE8yR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwA8nhz79b_AV9aiHh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxcqLMdPfIAIh0KG2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
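A batched response like the one above can be parsed and indexed by comment ID to drive the per-comment lookup. The sketch below is a minimal illustration, not the tool's actual implementation: the `ALLOWED` label sets are inferred only from the values visible in this response, and the real codebook may define more.

```python
import json

# Hypothetical label sets, inferred from the sample response above.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

# One record copied from the raw response above, for demonstration.
raw = '''[
  {"id": "ytc_UgxYktEcmkAKWoNgkJN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def index_by_id(raw_response: str) -> dict:
    """Parse a batched coding response and index the records by comment ID,
    rejecting any record whose dimension value is outside the label set."""
    coded = {}
    for rec in json.loads(raw_response):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

coded = index_by_id(raw)
print(coded["ytc_UgxYktEcmkAKWoNgkJN4AaABAg"]["emotion"])  # fear
```

Validating against a closed label set at parse time catches the occasional off-codebook value an LLM emits before it silently contaminates the coded dataset.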