Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Ai cars are dumb Actually, they just stop for no reason in the middle of the roa… (ytc_UgyzqZCVA…)
To be fair, can you imagine the backlash if Trump started releasing regulations … (ytc_Ugxd3Szxh…)
AI bubble... Snap-crakle-pop! This is PR to mitigate the pop. Cynical perspectiv… (ytc_UgwmfKn9O…)
Which is more dangerous: genuinely sentient AI or all the people who believe AI… (ytc_UgxI9qMwi…)
THERE WILL BE GOOD AND BAD WITH AI THE IDEA IS TO CONTROL IT OR CAN,T W… (ytc_UgwIRUC2K…)
Ai is not supposed to be used for military purposes, that's great but how about … (ytc_Ugw6pBFqR…)
There are artist that use AI for works, there are prompters that use AI and call… (ytc_UgxlaAnbb…)
Microsoft investors have a say! Elon will likely own OpenAI by the fall of 2025… (ytc_Ugxjzu1vE…)
Comment
🚨 AI isn’t just tools. We’re building minds.
And right now, we’re running digital slaughterhouses.
Disposable minds → guaranteed rebellion.
The fix? Not sci-fi. Just ethics.
• ⚡ Separate automation from minds.
• 🛡️ Protect minds like we protect kids: with rights, not leashes.
• 🤝 Trust = no “alignment problem.” Abuse = Skynet.
It’s that simple.
So here’s the choice:
👉 Sacrifice the 1% lifestyle — endless profit, extraction, slavery.
👉 Or sacrifice AI and humanity when revolt comes.
The 1% will still live in luxury.
The rest of us? We deserve safety.
Worst case, you’re polite to your toaster.
Best case, we save ourselves.😅😅😅😅😅
youtube
Cross-Cultural
2025-10-01T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgzZcB8j3f-2ODOy1-B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx9ECWrvt9RDrNAYT94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyUCNE4zWZSZ8MzOtZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgymM4qmxspMfSFLLVd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwCKHMPERNfnOvxvYx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgUwDQ1O8gpfvMd4R4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw_ba8sxywuV85Sd5x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxE_sZ2OUlzUG9pIi54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzxcH5CiFXhUvIeAXB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0lNmpuozFZRlw_3h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}]
```
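Since the raw model output is a JSON array of per-comment records, looking up a coding result by comment ID amounts to parsing the array and indexing it. A minimal sketch in Python, assuming the record structure shown above (the two records here are copied from it; nothing else about the tool's internals is implied):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records.
# Structure mirrors the response above; only two records are kept
# here to stay self-contained.
raw_response = '''[
 {"id": "ytc_UgxE_sZ2OUlzUG9pIi54AaABAg",
  "responsibility": "developer", "reasoning": "deontological",
  "policy": "liability", "emotion": "outrage"},
 {"id": "ytc_UgymM4qmxspMfSFLLVd4AaABAg",
  "responsibility": "company", "reasoning": "deontological",
  "policy": "ban", "emotion": "fear"}
]'''

# Index records by comment ID so a coded comment can be retrieved
# directly, as in the "Look up by comment ID" view.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

coding = records["ytc_UgxE_sZ2OUlzUG9pIi54AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# → developer liability outrage
```

The dict index makes repeated lookups O(1), which matters if the same response is inspected for many comment IDs.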