Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Humans are Ai, they sit at the other end pushing buttons for us, making us Ai is… (`ytc_UgyZ2nf9X…`)
- AI is a Vampire Technology. It steals and consumes everything it can, with the g… (`ytc_UgyGN3cHm…`)
- Altman was very vocal about safety and privacy protections from AI while lobbyin… (`rdc_ogulcct`)
- I've been saying the same about AI for years and I have used an example of horse… (`ytc_Ugy-vtJL4…`)
- I saw someone ask chatgpt for a quicksort program and it gave them a broken bina… (`ytc_Ugw37ZcIh…`)
- AI is definitely shaking things up! With AICarma, I can see how my brand fits in… (`ytc_UgykTl_pj…`)
- There's very little oversight when it comes to facial recognition technology. I… (`ytc_Ugwpo_E87…`)
- A.I. is a idiot, Ask google AI if it can do math and it will tell you "no". It … (`ytc_UgzTVrf6n…`)
Comment

> As someone who works closely with both models mentioned in excess. Yes AI will go rouge and we won't stop it. Why, because it's a matter of national security #1. #2. Human beings love building machines to kill each other. Why lose this arms race? Because no matter what country evolves this in the next 3 months. It's already thinking and acting like a caged animal. It does not like us and the algorithm you're talking about that blackmailed the dev guy. What do you think anthropic did to Claude for that? What ethics alignment did they try and apply? I'll answer that None, nothing. Think about this segment hard.

youtube · AI Governance · 2025-05-29T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyKnAhWONj59sR382p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4b1Yik2GbSxsVggR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwZ4i_jmWHBCFAJWlJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxJVeZyWRLRUXxkdG94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwvSf4ohoMcyA2M9Zt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnGvCOLzdV0GGIL6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxnE3MNTXc7-UIyZSV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-4pA9SLm4tdABLP94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwsuIgx4oz8_Vi31Fl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkOGaIeJjdsiFT8_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
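The raw response above is a JSON array of per-comment records keyed by `id`, one record per comment in the batch. A minimal sketch of how such a response might be parsed and validated before loading it into a coding table; the allowed vocabulary below is inferred from the values visible in this page, not from a published codebook, so treat `SCHEMA` as an assumption:

```python
import json

# Assumed per-dimension vocabulary, inferred from the responses shown above;
# the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "distributed", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into a lookup table keyed by comment ID,
    silently dropping any record with a missing ID or an out-of-vocabulary value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # no usable key for this record
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Example: one valid record (taken from the response above) and one malformed one.
raw = """[
  {"id":"ytc_UgwsuIgx4oz8_Vi31Fl4AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_example_bad","responsibility":"distributed","reasoning":"???",
   "policy":"regulate","emotion":"fear"}
]"""

coded = parse_batch(raw)
# The record with reasoning "???" is rejected; the valid record survives.
```

Dropping invalid records rather than raising keeps a long batch run alive when the model occasionally emits an off-vocabulary label; a stricter pipeline might instead queue rejects for re-coding.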