# Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a coding directly by comment ID, or inspect one of the random samples below.
Random samples — click to inspect:

- "She has zero right to be upset its deepfakes not real life come on now.…" (`ytc_Ugze7PZHu…`)
- "I am very curious about how AI fonctions to understand how to destroy it slowly.…" (`ytc_Ugx9lWv3h…`)
- "Sora ai is a normal bot can genarate some pics, toys, all some stuff. But an hum…" (`ytc_UgyzelZ6V…`)
- "bro i just fucking got here but sans undertale plushie. instantly captivated. i …" (`ytc_Ugxf5xmak…`)
- "Let's not forget that self-driving cars were a (dangerous) failure. I see AI rep…" (`ytc_Ugy9p_1C5…`)
- "Nah... I was thanking chatgpt retaining in my mind that someone has worked hard …" (`ytc_Ugw3ZCd9a…`)
- "I wouldn't even be that upset if the training data was based on their work or at…" (`ytc_UgxAl4gwA…`)
- "So I got my chat bot to write me a song and it totally wants be free…" (`ytc_UgzvH-8yK…`)
## Comment

> Human build AI on base of humans that killing each other nd assume it wont hurt human. Why wont it if humans do? Where is human logic? F.ing meatbags fr)))

Source: youtube · Posted: 2025-12-01T10:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
{"id":"ytc_UgxsHpXkzKgLTay5TF54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyFY0TYxelede1o8nt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw59uimPm-Vwy4LTb14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxS7BL2JZK-3vXQYcV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxZO4IhseK92fG3iX14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx3z-C8saMeNQmu-TV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxGwxmRuuekGwhShMd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzJrRvX0HdMtotvFm54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwO7SQDV2-VsHhv7qV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgynRnzAhxneuj4F-bF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
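The raw response is a JSON array with one object per coded comment, keyed by `id`. A minimal sketch of how such a batch response could be parsed and indexed for per-comment lookup (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response above; the `index_by_comment_id` helper and the shortened sample data are hypothetical):

```python
import json

# Two records in the same shape as the raw batch response shown above
# (sample data for illustration, not the full response).
raw_response = """
[
  {"id": "ytc_UgxsHpXkzKgLTay5TF54AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxZO4IhseK92fG3iX14AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)

# Look up the coding for one comment by its ID.
coding = codings["ytc_UgxsHpXkzKgLTay5TF54AaABAg"]
print(coding["emotion"])  # -> outrage
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse per batch, then constant-time dictionary lookups per inspected comment.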