Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.

Random samples:

- "Omfg, are all us collectively being polite to AI for this reason. Cause same 🤣…" (ytc_Ugwf1-Nfn…)
- "AI art is becoming more popular but i reckon it'll dig its own grave it'll make…" (ytc_Ugz5Dijg1…)
- "As an artist, I find it absolutely hilarious when AI bros talk about art and act…" (ytc_UgyLXIPTw…)
- "AI is one of the few things I'm hopeful for. I'm not saying there aren't bad eff…" (ytr_UgwTCm_1S…)
- "Truth? EMPATHY? That’s fucking rich coming from this guy. He wants his Ai to hav…" (ytc_UgxlzQtQD…)
- "Here's an idea. Ban self driving if a human can drive it. If an accident happens…" (ytc_UgwRdvXW3…)
- "It feels like you haven't finished the thought... Since AI is still an unconscio…" (ytc_UgxbGiQ7f…)
- "THERE ARE THOSE WHO FEEL THEY ARE ABOVE SIN... WHAT CAN YOU DO?... WHEN IT IS IN…" (ytc_Ugz6Q7iAy…)
Comment

> The reaction time of AI is easily 100 times more than a human. It's like a grizzly bear fighting a sloth. No human on this earth has a chance going at this thing unarmed. It technically doesn't even need to be programmed to fight back. You can set it to block and dodge only until the human gets worn out. You won't land a hit, and even if you somehow managed to it still wouldn't phase it.

Platform: youtube · Topic: AI Responsibility · Posted: 2024-01-14T18:0… · ♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxwPbTVeM233kJWfy94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwbzz_tcy7s3rKtFRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwiz8fBPcoqNQX5ad14AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwA-WL2fDMRnLxguy54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbeLgMPPLkGJXt3wB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz9NyFkn3fa9EOJUbR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxrQr6RsR_vB2h1ORl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw88jPjI5G0a-WFyZl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxMcl-EizO_tN8qj694AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyypHmwHwzU-D4I1sV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
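The raw response above is a JSON array of per-comment records, each keyed by a comment ID. As a minimal sketch of how such output could be parsed and indexed for the look-up-by-ID view (not the app's actual code; `index_codes` and the two embedded sample records are illustrative, copied from the output above):

```python
import json

# Raw model output: a JSON array of coding records, one per comment.
# The two records here are copied from the response shown above.
raw_response = """
[
  {"id": "ytc_Ugwbzz_tcy7s3rKtFRh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz9NyFkn3fa9EOJUbR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the model's JSON array and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
# Look up the coded dimensions for one comment by its ID.
print(codes["ytc_Ugwbzz_tcy7s3rKtFRh4AaABAg"]["emotion"])  # fear
```

A real pipeline would also need to handle malformed model output (e.g. prose wrapped around the JSON), but the lookup itself reduces to this dictionary index.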