Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- 37:10 Okay but you did pay for Grok 3 (out of all LLMs that you could have been … (`ytc_UgzvAkfe4…`)
- As usual, the US is way behind the EU on this. And the US is refusing to ban let… (`ytc_Ugxib9sHm…`)
- I dont think this guy realizes that most people hate their 9-5jobs and wouldnt m… (`ytc_Ugw5v6rjP…`)
- Uh, K? You're apparently not paying attention to technology Jimmy. Amazon's gett… (`ytc_UgiqVWTyv…`)
- I hate how Ai users say "Ai makes it for you so dont have to spend so many hours… (`ytc_Ugze9OoxZ…`)
- Haha, it does look quite realistic, doesn’t it? The robot’s design really aims t… (`ytr_UgyvlyXtR…`)
- trucks can't drive 24 hrs a day. They have to be refueled . Won't happen Too … (`ytc_Ugxvq2uiq…`)
- He is kidnapped and not dead. Technically “enforced disappearance “ , by globa… (`ytc_UgxWhqsD-…`)
Comment (youtube · AI Governance · 2026-04-05T19:0…)

> So it's two ways to go... and it's just like Terminator movie... humans trying to fight machines and survive as a species but AI wants to survive as well... so may the best one win and if machines win what have they won????? Think
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugybsw72Jk1rfBFU6zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyP-uY94tITDkNhi2V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy-POZCw2GA0q-79zV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwQaDFyOMgWZuqvwsV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8uEBKQAWuCsXWXX54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxpgzQ9C4v0ilcnw094AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxnJjPLdOZZaGvSEPV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwDeA3PXIPKCSyzEkF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz1FSocitOVNSVJlV94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-6WSSp5ICcs1jF2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
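A raw batch response like the one above can be parsed into a lookup table keyed by comment ID, validating each record against the coding schema before use. This is a minimal sketch, not the tool's actual implementation; the `SCHEMA` sets are assumptions inferred from the values visible in this dump, and the real codebook may include additional categories.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from values
# seen in this page; the actual codebook may define more categories.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "user",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "mixed", "unclear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of records) into a
    dict keyed by comment ID, keeping only schema-valid records."""
    coded = {}
    for rec in json.loads(raw):
        # Drop any record with a missing or out-of-schema value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Look up a single coded comment by its ID.
raw = '''[
  {"id": "ytc_Ugybsw72Jk1rfBFU6zl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''
coded = parse_response(raw)
print(coded["ytc_Ugybsw72Jk1rfBFU6zl4AaABAg"]["emotion"])  # fear
```

Validating against the schema before storing catches the occasional malformed or hallucinated record in a batch, rather than letting it surface later as an unexpected value in the coded dataset.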