Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugwq6VjPV…: 6:34 i want to draw a comparison to my toyota in this section. Toyota does not h…
- ytc_UgzzViOHw…: Tired of the "automation won't replace humans, but work with them" nonsense. Ent…
- ytc_Ugxo8mdp3…: "Alignment? Aligned to what? American rules? Chinese rules? That illusion must d…
- ytc_UgyfHXlgz…: At least the AI in ‘The Matrix’ were more creative. Instead of eliminating human…
- ytc_UgwtBQnnQ…: AI users tend to be the same ass hats who drive BMWs with daddy's money. They co…
- ytc_UgzDcfduI…: Someone please tell me why we need AI and why these people are trying to hard to…
- ytc_Ugxh0VIKf…: Thats incredible man. I never seen the inside of a university. I got to tell you…
- ytr_UgwJe8Fpy…: Art is like giving a kiss, drinking water, enjoying a walk in nature , reading a…
Comment

> I'm sorry, but most of the logic of this guest makes no sense. The proof that we are living in a simulation is religion? Please, there's nothing scientific about that claim. Lots of assumptions. While AI is definitely a risk, this guy needs to revise his assumptions and contradictions. He assumes everyone wants to live forever, and he believes AI is the key to achieving this goal, but at the same time, he claims that AI will destroy humanity. Too many claims, too little logic and proof.

Source: youtube · AI Governance · 2025-09-11T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx06jWp559Kas_jJ8R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwwN49z03zSmddoNgB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwfIHnlDn7VkKpdWzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugym5sY6KBCgB8oR0t94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx2ako-f8a-00Svg314AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzuixos3evi4dpEXSx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyGKgqESre5yjqzX8Z4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxVmZQsn-l4LLCaj0x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwiLB95X2GSHB4TfS54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugycp3NTGYMZpXNQ8up4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
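A response like the one above can be turned into a per-comment lookup table keyed by comment ID, which is what the "Look up by comment ID" view needs. Below is a minimal sketch, assuming only what the sample shows: the raw output is a JSON array of objects with an `id` field plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The function name `parse_raw_response` and the fallback value `"unclear"` for missing dimensions are illustrative assumptions, not part of the tool.

```python
import json

# The four coding dimensions seen in the sample output above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) into a
    {comment_id: {dimension: value}} map for lookup by comment ID.

    Hypothetical helper: the schema is inferred from the sample response,
    and "unclear" as a fallback for missing dimensions is an assumption.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        # Keep only the expected dimensions, ignoring any extra keys.
        coded[cid] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

# Usage with a single-record example in the same shape as the raw response:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded["ytc_example"]["emotion"])  # → outrage
```

Keying by ID also makes it easy to join the coded values back onto the original comment text, since both share the `ytc_…` identifier.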