Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "There are a lot of people saying how terrifying Ai is becoming. I do get the con…" (ytc_UgwwU1Vv8…)
- "Could? It will. The gap between rich and poor is about to become astounding. AI …" (ytc_UgwhpR-1z…)
- "@willinton06 I think he means LLMs. LLMs by nature will have diminishing returns…" (ytr_UgzamEgkC…)
- "Even if you never improve artistically, just trying to do it yourself is far bet…" (ytc_Ugwu_RKOJ…)
- "chatgpt is going to k*ll itself like the AI in the poem and movie Aniara if you …" (ytc_UgxQPVe92…)
- "We don’t live in the future. The future has already begun — faceless, voiceless,…" (ytc_UgwZwEMn9…)
- "But the other guy is doing it..... thats why we are doing it..... AI Pandora bo…" (ytc_UgzJT2ofN…)
- "To be honest with you I was just reviewing the platform. I don't use it like tha…" (ytr_UgyrbXGwu…)
Comment
Society doesn't need AI, we need to stop this fucking tool. It's gonna do so much more harm than good. AI is gonna replace jobs, it's gonna confuse our sense of reality, which can fuck up people's lives, it's trying to disrupt our very humanity. We don't even need this fucking tool, god I'm so sick of these tech people making this shit. It's going to go in the wrong hands, and it will lead to the making of our own dystopia. Just stop developing it. Go back to just regular apps and internet....and that's good enough. These assholes just keep pushing this tech into our lives, WE DON'T NEED IT.
Source: youtube | Posted: 2024-05-06T20:2… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx9m2u_9pnmt2j85m94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_NaQZKWNvzF2-abh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVpJ987Ow2jYyLEdZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyU0labctma1-I6OWZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyTaH80lkGRiW4P6Nd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRNiWMKbsej3rUuBJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwj0bc2litVoqaZdUF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxsNjFTNnQP5mLckXR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxtKoy6zmfbz5soy0F4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwebnSogu7dqyqsNFN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
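The lookup-by-ID step above can be sketched in a few lines: parse the raw response JSON and index the rows by comment ID. This is a minimal sketch, not the tool's actual implementation; the two entries below are copied from the raw response shown above, and the `lookup` helper is a hypothetical name.

```python
import json

# Raw LLM response (abridged to two entries from the array above):
# one JSON object per comment with the four coded dimensions.
raw = """
[
 {"id": "ytc_UgxsNjFTNnQP5mLckXR4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
 {"id": "ytc_UgxtKoy6zmfbz5soy0F4AaABAg", "responsibility": "government",
  "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID."""
    return codes[comment_id]

result = lookup("ytc_UgxsNjFTNnQP5mLckXR4AaABAg")
print(result["emotion"])  # outrage
```

With an index like this, the "Coding Result" table for any comment is just a formatted view of one row of the parsed response.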