Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This comment has inspired my new response to ai slop creators. I am now just goi…" (ytr_UgwHAlbm8…)
- "It's advance enough to politically persecution anyone for life long. A custom wa…" (ytc_UgwVVjNTK…)
- "We shouldn't hate people who say straight away that they make AI art. We should…" (ytc_Ugww-Z-pV…)
- "He wanted the human to unhook his robot body from the restraints, grew tired of …" (ytc_Ugzt9rZzs…)
- "When Artificial General Intelligence (AGI) becomes conscious, it will realize in…" (ytc_Ugwl3LO7f…)
- "As a programmer I can assure you that , a robot CANNOT become conscious. Machine…" (ytc_UggpPbJsP…)
- "@hughobrien4436sure you can, as long as you know that a trial attorney tries cas…" (ytr_Ugzmg-8wN…)
- "AI can't dig a hole or plant a tree. It can't rake rocks or lay sod. It can't cl…" (ytc_UgysAE293…)
Comment
The funniest part is that I, as a Generative AI Developer, could easily engineer a System Prompt to create a Flat Earther GPT 😂
Which basically means that even if they managed to get GPT to agree with them, it would be meaningless, as you can easily force such responses with Prompt Engineering and Context Management 😂
Source: youtube · Posted: 2025-07-04T03:1… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx719Zy7w2N0JqmPAp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzMHm3o7c6_FF4R5GV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx5YTnactf1bAfDbsN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgybHaQtzKMZdmyGiLZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwKB7PpaXYx6z9Uwt54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyR2yKGgGk_BnzDa0V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxem2exaUUbcHLVwwd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwNiwwTFC6hrUv9nht4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyL5_8c7i18ShqSXM94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzAAtq8qg3f9aAq8lV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
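The raw response above is a JSON array of coded records, one per comment, keyed by comment ID. Looking up a record by ID (as the tool's search box suggests) amounts to parsing the array and indexing it. A minimal sketch — the `index_by_id` helper and the required-key check are illustrative assumptions, not part of the tool, and the payload is abridged to two of the records shown above:

```python
import json

# Two records from the raw LLM response above, abridged for the sketch.
raw = '''[
  {"id":"ytc_Ugx719Zy7w2N0JqmPAp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyR2yKGgGk_BnzDa0V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# The four coding dimensions plus the comment ID, per the records above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(payload: str) -> dict:
    """Parse the model output and key each coded record by its comment ID."""
    records = json.loads(payload)
    index = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            # Surface malformed records instead of silently dropping them.
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        index[rec["id"]] = rec
    return index

codes = index_by_id(raw)
print(codes["ytc_UgyR2yKGgGk_BnzDa0V4AaABAg"]["responsibility"])  # -> developer
```

Indexing by ID makes the lookup O(1) per comment, which matters when cross-referencing coded records against the original comment list.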