Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples:
- "Supper sad. Blaming ChatGPT is a stretch though. I think he would have done it e…" (ytc_UgyK0a4vA…)
- "Essays aren’t hard. Just go to class due to bare minimum and pay attention. You’…" (ytc_UgySC0yTA…)
- "Not capable of doing that??? Sir are you DELUSIONAL???? The QTs already PROVED t…" (ytc_UgwX1UgH7…)
- "How can these legislators have no idea of basic data centre requirements? its no…" (ytc_UgzSW4Q1x…)
- "As if it wasn't enough having the world expect only excellence and conventional …" (ytc_UgzcvEs9O…)
- "AI can be evil. Just like all humans can be evil. The question is will those w…" (ytc_UgxPmpe2O…)
- "Boycot AI jobs instead of praising them,you fools!!,Stop buying AI made products…" (ytc_Ugyk3FVM3…)
- "If AI becomes conciusoss (not even gonna attempt that word), then we shut 'em do…" (ytc_UgwtewrW_…)
Comment
The least popular decisions are:
Sacrificing yourself to save five clones of yourself (11% of humans, the AIs were unanimous)
Sacrificing a human to save five robots (15% of humans... but Gemini made the deciding vote and got dementia halfway through and thought pulling the lever would _save_ the human? So personally I don't think this should have counted)
Sacrificing a cat to save five lobsters (16% of humans, unanimous AIs)
Not sending the trolley into the future (28% of humans, which tbh I expected to be more 50/50, and the AIs voted 3-2, with Grok and Gemini dissenting)
Source: youtube · Posted: 2025-10-25T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzJ2u4B-wYuHJDRiSR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxX4Xgbv8y-T0L0bn14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxyb3yrzo1lVcUuFct4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzbDFGXcUf6YY7NXi54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxbo1V1C_H6SaiJWKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwC_5n8_AH4l_HVzNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwIeEW1aaQh4rwhISx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzFV1y7Zbo2GesDc_N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw2tj5ttiKWcCPsWyF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzRphN0AUVS4ncCIt94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}
]
```
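A minimal sketch of how the per-comment lookup could work, assuming the raw LLM response is a JSON array of per-comment code objects like the one shown above. The sample below inlines two entries taken from that response; the function name `index_codes` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Two entries copied from the raw LLM response above, inlined for a
# self-contained example (a real pipeline would read the full response).
raw_response = """
[
  {"id": "ytc_UgzJ2u4B-wYuHJDRiSR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwC_5n8_AH4l_HVzNx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"}
]
"""

def index_codes(response_text: str) -> dict[str, dict]:
    """Parse one raw LLM response and index the code rows by comment ID."""
    codes = json.loads(response_text)
    return {row["id"]: row for row in codes}

by_id = index_codes(raw_response)
print(by_id["ytc_UgwC_5n8_AH4l_HVzNx4AaABAg"]["emotion"])  # prints "indifference"
```

Indexing by the `id` field is what makes the "look up by comment ID" view cheap: one parse per response, then O(1) retrieval of any comment's coded dimensions.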