Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews, with comment IDs):

- `ytc_UgyELGINJ…` — "People who don't talk to AI models politely are the same people who abandon trol…"
- `ytc_UgxbudFk3…` — "The moment I realised I can ask an AI LLM the following: 'Can you summarise this…"
- `ytc_UgxsfEBTE…` — "My thoughts: We're not doomed. The trailer with the monsters was visibly messy, …"
- `ytc_UgzrFTiiQ…` — "It is not that AI will 'take your job', it is what is the alternative for us, th…"
- `ytc_UgxCh1ulr…` — "I'm a professional writer. My work is going very well because I work high-end B2…"
- `ytc_Ugxogt-RM…` — "That's such a huge problem. They don't see art as actual work and skill. They do…"
- `ytr_UgxyPqlgA…` — "It is really cool to learn art. Just don´t get arrogant about it, and accept tha…"
- `ytr_UgwAhjFyf…` — "All the major countries are investing in AI and OpenAI is a major vehicle for th…"
Comment
> Well... We are literally the only species that is bringing its own species to extinction and also taking the animals and planet with us. 😅 I see nothing bad in AI taking control.
> At least if that happens, the animals and the people that actually live environmentally friendly will be safe.
> I don't have social media and I only use Chatgpt once because a friend insisted, and the only thing I could possibly think on asking "it" was "Why humans are the most dangerous parasites on the planet" and it gave me a very good answer, but that's it.
> Is just a tool...
> Is not as if humans need something artificial to paint, write, create... Needing an app to make basic things is pretty pathetic and is making humans that "need it" useless.
youtube · AI Governance · 2025-09-08T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyZMSTSwiVlaIi_TCp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgynUnUd6bMRiqsJL-14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxxX0Nnd5jZsKT7bk54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzucEc2uEoXs-EGNIp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyzwk7E2ev5VLv-urh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxVhJhbWUINJpAAix54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgydzgGhUd1NpjKg5rR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxn76bxUFHvsFd5UM14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx2cj-UrKkXcZ2T2jt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzIRPg6ds4IcRlYisR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
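A raw response like the one above can be parsed into per-comment codings and looked up by comment ID. The sketch below is a minimal, hypothetical parser: the field names match the coding result table, but the allowed value sets are only those observed in this sample (the full codebook may define more), and `parse_codings` is an illustrative helper, not part of the actual pipeline.

```python
import json

# Value sets observed in this sample only -- an assumption, not the full codebook.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "approval", "mixed", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into
    {comment_id: coding}, rejecting values outside the observed sets."""
    codings = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in OBSERVED_VALUES.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={entry.get(dim)!r}")
        codings[cid] = {dim: entry[dim] for dim in OBSERVED_VALUES}
    return codings

# Usage: look up one coding by comment ID (hypothetical ID for illustration).
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(parse_codings(raw)["ytc_example"]["emotion"])  # resignation
```

Rejecting out-of-set values at parse time catches the common failure mode where the model invents a label outside the codebook.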