Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a sample directly by its comment ID.
Random samples

- "I feel terribly sorry for the victims of the crash, but no way that the cause of…" (ytc_Ugz1ZlrNl…)
- "Every video about AI for some reason reminds me of a video talking about how a d…" (ytc_Ugw8JPrBK…)
- "i cant believe there's an actual fucking subreddit about defending AI images 😭 u…" (ytc_UgyCaPKY4…)
- "I heard that a guy was delusional from chat gbp telling him he was a genius abou…" (ytc_UgwyIhsD0…)
- "That's weird that AI seems to only target black people. It makes me wonder if th…" (ytc_UgwnlqKBC…)
- "You can use open source models like llama 3.1. If the service you run it on is w…" (ytc_UgwmaIxDl…)
- "Big Chloride has it claws deep in this guy. So instead of being sheep about br…" (ytc_UgzoTwDjl…)
- "There's no more issue with AI training on a copyrighted work than there is with …" (ytc_UgwIUBpyv…)
Comment

> No matter what robots can never over shine human. Why because they use motor with coil. Wants you pour fuel on them and light them with fire. All the wires will born. And the robot will fail. But some human can survive fire burn. Many have survived it. So robot just pour water on them and they wld fail.

youtube · AI Governance · 2024-10-20T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw4wCWXzWIPS2Icfdl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx_a4ja6ddmeLZz1GJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx4VIYCHmWiY9O98QV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgztD5McpYouOkwFu5Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxP8f5OkEAScHuIux54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6tTPBk8IGTL3meQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4clMtWzNLgvbVZbV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzqw7C1Gf-rXvVXNB54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyRNr2Ke8K52o_e0pB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzKfFvUA7Zs11yZONZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
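The raw response above is a JSON array in which each object carries a comment `id` plus the four coded dimensions from the table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and indexed for the ID lookup shown above — note the allowed value sets below are only those observed in this sample; the full codebook may define more:

```python
import json

# Category values observed in the sample output above; the real
# codebook may allow additional values (an assumption, not the spec).
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "developer"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if the JSON is malformed or a dimension holds a
    value outside the (assumed) codebook.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        # Keep only the coded dimensions, keyed by comment ID for lookup.
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the result indexed this way, the "look up by comment ID" view is a plain dictionary access, e.g. `parse_batch(raw)["ytc_Ugw4wCWXzWIPS2Icfdl4AaABAg"]["emotion"]`.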