Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Some companies understand it wrong, you were supposted add AI to existing progra…" (`ytc_UgxIH0t8x…`)
- "That would make for a great music podcast topic. You just made me curious about …" (`ytr_UgzszD-c1…`)
- "This robot will be enemy of human. And i guess this robot should be not exist. F…" (`ytc_UgyxbbEP1…`)
- "Let's not miss the forest for the trees. Instead of listing out rules for ethica…" (`ytc_UgwsPHT26…`)
- "2026. -Opens mailbox -Sees envelope from some government agency -Delicious…" (`rdc_m6yjm3p`)
- "Hilarious that Musk bought Neuralink and yet is "worried" about AI! Wonder how m…" (`ytc_UgxmdiTFU…`)
- "I find it funny that the AI "artists" don't believe they'll just be replaced too…" (`ytc_Ugwjxm6Ke…`)
- "there's something deeply disturbing about how he was handwaving away the idea th…" (`ytc_UgwgH27YD…`)
Comment

> No. I hate to say it and be the bad guy, but all of the rise of AI is bad. The more dependent on AI, the less human we are. Having a computer do the things we have the right to takes away our humanity. Unplug all of this before it is too late!

Source: youtube, posted 2025-06-05T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugwy8K9kEzdYa04R86h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw72hyjtOa6Okmoz6l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx1NOjRyMaWPsCryCZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw9q9-AaLrN-5peve14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwuGEdVBE7N_7q1op14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzcNRGNlDeX8v-mKel4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz-4XdhiCAEPgbfclR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzRluF1b60e_04GnV94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgziRkZycKqQTBrrNMB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyxPzqxARJyKksCBT94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
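The raw response above is a JSON array of coding objects, one per comment, keyed by comment ID. As a minimal sketch of how such a batch could be validated and looked up by ID (the `index_codings` helper and the key-checking logic are illustrative, not part of the tool itself):

```python
import json

# Truncated to two entries from the response above, for brevity.
raw_response = '''
[
  {"id":"ytc_Ugwy8K9kEzdYa04R86h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw9q9-AaLrN-5peve14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
'''

# The four coding dimensions shown in the Coding Result table, plus the ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index entries by comment ID,
    rejecting entries that are missing any expected dimension."""
    entries = json.loads(raw)
    for entry in entries:
        missing = EXPECTED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')} missing keys: {missing}")
    return {entry["id"]: entry for entry in entries}

codings = index_codings(raw_response)
print(codings["ytc_Ugw9q9-AaLrN-5peve14AaABAg"]["policy"])  # → ban
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse, then constant-time retrieval per comment.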