Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "Three pounds of brain matter clinging to fading glory while silicon minds leap f…" (ytc_UgxRGmgm8…)
- "I never even thought of trying to use AI to write a book that's so sad and stupi…" (ytc_Ugy7qc9Nn…)
- "This exact problem is why we built The Founder Kid. The traditional education sy…" (ytc_Ugz1y2Ozd…)
- "A person typing a prompt to an art generating AI is the contemporary equivalent …" (ytc_UgyTW3Mye…)
- "So they made the first dumb AI. if we broke the limiter and gave it emotional ma…" (ytc_UgwFp_B47…)
- "@HarryKetlerut like that can also take a toll on someone if you're doing all t…" (ytr_Ugw2bTSKo…)
- "AI art is art. People being scared of AI art sound the same as the people who we…" (ytc_UgxmF8RMk…)
- "BYE I WAS USING AN AI AND DUDE I WAS USING AN OC THAT WAS A MINOR AND IS JS A SI…" (ytc_UgxSnqTyF…)
Comment
Automatisation in production is actually good. It's easier to nationalise and tax. Because while rich operate with people force, they can advocate that they spend money on people. But if it's mostly robots, that means it belongs to civilization achievements/society.
youtube · AI Jobs · 2025-10-08T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgywmgCp3d6v725Ybq94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_pMSlonVtxIIbfPN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy3ciKFROhc39wtrPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxjGhcygc6bBt_fF5F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzHwzPHShf7QVc9dk14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxohSblCUXgI6gNVAp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzmYMWhxlD0oQxfv614AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxxtBM8TbS9CTTuKTh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyUtpjf2bjKJ8YOPPB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxmyw5a1Bl3xEXf_jN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
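A raw response like the one above can be parsed and screened before the codes are stored. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown here, but the allowed value sets are only the values *observed* in this dump — the real codebook may define more categories.

```python
import json

# Value sets observed in this page's raw response; assumption: the actual
# codebook may allow additional categories per dimension.
OBSERVED_VALUES = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "approval", "outrage", "fear"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when its id looks like a comment/reply id
    (ytc_/ytr_ prefix) and every coding dimension holds a known value.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            continue  # not a recognizable comment or reply id
        if all(rec.get(dim) in vals for dim, vals in OBSERVED_VALUES.items()):
            valid.append(rec)
    return valid
```

Dropping malformed records rather than raising keeps a long coding run alive when the model occasionally emits an off-schema value; rejected records can be logged and re-coded later.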