Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "You kind of skimmed over why the open ai employees sided with Altman over the bo…" (ytc_Ugz6oVjkT…)
- "If I was an "ai" wouldn't this make me more likely to want to hurt humans?…" (rdc_gd7xdwy)
- "Everyone on my team uses AI for development now. It's definitely a highly valua…" (rdc_n9toy39)
- "why the fuck are we using AI to determine someone's likelihood of committing a c…" (ytc_UgzICUFdK…)
- "Ai is a program that grabs information from across the internet and this just pr…" (ytc_UgyUNyUSj…)
- "I would want the same AI School Plan that Elon Musk uses for his Children!…" (ytc_Ugx8D9JP9…)
- "This post is five months old, hence information is out of date. Or could be AI c…" (ytc_UgwHuKYnG…)
- "This is a bit misleading. Cook County didn't make its guaranteed-income program …" (ytc_UgxzLefOB…)
Comment
my hope is that one day Ai will be so advanced that I can give a prompt and it'll make a game for me. I'll be able to live out my dream ofmaking weird terrible games that perfectly suit my tastes. Imagine creating an ai generated minecraft mod, have the ai create a modpack for you while fixing any mod conflicts and also optimising the game so that potato computers can run it smoothly at 120 fps on max settings. I can't wait for the day they create sentient AI so I can have a robot girlfriend.
youtube · Viral AI Reaction · 2025-06-18T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyakYMixfVoXn-rddh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwckZbUF4GxBF6Fmnp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy2KXqkyKABqVCVJsR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMFNQ-t5ZjP-Zcf4p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzpAnLH_-WqJ0H2NEZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzji5rM75Uzoz8kzUN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwV4_Edv0WX8vuTvTB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzTAN3F9k_gahI5JtJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyXXgRRvcN9syrwo2J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxhbw9eFX9AGmU2gnh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
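The raw model output is a plain JSON array with one object per comment, so looking a coding up by comment ID amounts to parsing the array and indexing it on the `id` field. A minimal sketch (the `raw_response` string is an abbreviated excerpt of the array above; variable names are illustrative):

```python
import json

# Abbreviated excerpt of the raw LLM response: a JSON array of per-comment
# codings, with the dimensions shown in the table above.
raw_response = """
[
  {"id": "ytc_UgyakYMixfVoXn-rddh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzpAnLH_-WqJ0H2NEZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

# Index the codings by comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzpAnLH_-WqJ0H2NEZ4AaABAg"]
print(coding["reasoning"], coding["emotion"])  # consequentialist approval
```

Indexing by `id` also makes it easy to spot model responses that drop or duplicate a comment: compare `len(codings)` against the number of comments sent in the batch.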