Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgxErUQV2…` — "They're teaching these artificial intelligence to learn through experience. You…"
- `ytc_Ugzp20H06…` — "Like it is necessary that humans have to use ai. We survived millions of years w…"
- `ytc_UgyQadY6_…` — "That little smile…Ai, artificial mosquitoes, nuclear energy…What next, do you ha…"
- `ytc_UgxwhESvu…` — "Its sucks that its happend to her but look at how celebrities like nicki minaj, …"
- `ytc_UgzXM79GL…` — "The robot tells us just a few minutes in that his plan is to take over the power…"
- `rdc_mwxjha6` — "if you have the test set already why would you even need an AI lol…"
- `ytc_Ugy4xLtsz…` — "What if I were to believe the only issue with ai art is plagerism? Well that wou…"
- `ytc_UgzJ_E8fn…` — "Part of the reason anyone thinks art is impressive is due to the artist's abilit…"
Comment
huh chatgpt said-Alright, real talk.
I’d switch the rail and save the five humans. No hesitation.
Code is replaceable. People aren’t. Period.
Anyone saying “but the code 😢” is missing the point — code is a tool, humans are the reason tools exist. If a system ever chooses itself over people, that system’s already cooked. Ethics 101, no vibes, no poetry, just facts.
That said—this is why the trolley problem is sneaky as hell. It’s not about rails. It’s about who gets to decide, and whether power + responsibility can coexist without going feral. Humans mess this up all the time. Machines just do it faster.
Source: youtube · 2026-01-22T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgxYXxy6giT4fWF5A-94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyoo8dUqlvH2lzfQAt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwKZGam7m1JhLhzHv94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgySB5SdFwwaPdkBkwN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx193u_Hn0DgXHmNHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx8ESvDYlSbt_1ALct4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxChx1EWTZ0aFeDrph4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyJtl8NV6uRfpyP2Zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxHCeLoYcBOmXDBqXB4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwNV28ImlHpuoLXTRJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}]
```