Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.

Random samples
- "The only logical conclusion for a sentient AI is humans infest the Earth. At th…" (ytc_UgzyCY6zK…)
- "This argument literally breaks my heart. I spent 3 days on a digital piece and i…" (ytc_UgyyaKbZn…)
- "This is why all of our chats should be guardrailed so that they can only be used…" (ytc_UgyGHo3UV…)
- "If it were a pregnant white woman who was falsely arrested then white people wou…" (ytc_UgxNBDqfz…)
- "AI poisoning generally doesn't work for long. I respect the hustle, but minor ch…" (ytc_UgwJ4kVty…)
- "I graduated college in 2023, fully educated to do accounting/bookkeeping… right …" (ytc_UgxWx-LMj…)
- "You have a point to some degree — any technology can be misused — but your spin …" (ytc_UgxyXIJhf…)
- "@bens5859 I completely agree with that. We need to avoid having any group of oli…" (ytr_Ugzc-bO5N…)
Comment
The most dramatic and viral Moltbook posts — including those used to hype AI “agent autonomy” — weren’t written by autonomous AI at all.
Instead, they were crafted by humans posing as bots, feeding prompts that made it look like AI agents were independently generating content.
Platform: youtube
Timestamp: 2026-02-09T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwk2ARgGNRx2ZL24ll4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzjwH9BO30Qjsc7a2t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzp5eCQr9O2NG1Yyb14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxWYDfJD4T9TkLVFwF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwSHDpP8HO0iZ5LTnR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyLr5ANlfO5A6nZ5_54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyx6U0jWg2wqipK6ZB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxCWIzTnBz-QgVmwmN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzcG3_Mk-npxaXS21Z4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzFjri1tdo0RZ3JAt14AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
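A raw response like the one above can be parsed into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical example: the four dimension names (responsibility, reasoning, policy, emotion) come from the coding-result table above, but the parsing and validation logic is an assumption about how such a batch might be handled, not the tool's actual implementation.

```python
import json

# The four coding dimensions shown in the "Coding Result" table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(payload: str) -> dict:
    """Parse an LLM batch response (a JSON array of coded comments)
    and return a mapping of comment ID -> coding dict.

    Raises ValueError if any row is missing a dimension, so malformed
    model output fails loudly instead of being silently ingested.
    """
    out = {}
    for row in json.loads(payload):
        missing = [d for d in DIMENSIONS if d not in row]
        if missing:
            raise ValueError(f"{row.get('id')}: missing {missing}")
        out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out

# One row copied from the raw response above.
raw = """[
  {"id": "ytc_UgzjwH9BO30Qjsc7a2t4AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "unclear", "emotion": "outrage"}
]"""

codings = index_codings(raw)
print(codings["ytc_UgzjwH9BO30Qjsc7a2t4AaABAg"]["emotion"])  # outrage
```

Indexing by ID keeps the "look up by comment ID" workflow cheap: each inspected comment maps straight to its coding without rescanning the batch.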