Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "i remember back in 2019 when a korean ai model made Freddie Mercury sing a song,…" — ytc_UgzlbteXm…
- "THIS MIGHT LOOKS JOKE BUT SOME DAY SOON OUR COMING GENERATION GONNA LIVE IT LIKE…" — ytc_UgzXN4ASz…
- "@brianmi40 So this is the issue. "AI" as it is being used most often now, actua…" — ytr_UgxAcioiW…
- "I just watched this man conversationally berate this AI for almost 20 minutes an…" — ytc_UgwsQA-jf…
- "Je w Sanders, that's not the issue at all. Not even a little. The issue is the s…" — ytc_UgxPWALbW…
- "As long as the logic guiding the current technocratic system is profit a…" (translated from Italian) — ytc_Ugw45_PtD…
- "TLDR; Unfortunately... A lot of this is pretty much garbage. Is AI becoming a po…" — ytc_UgwvxrmpU…
- "Interesting. Here's are my problem with AI: it's being sold and pushed into ever…" — ytc_Ugwg5bEuX…
Comment

> Lets be honest there has never been anything like this history not even close. We never had robots replace our positions anywhere in history and just because you are benefiting from it now doesn't mean things can't go very wrong. Theoretically these systems can disrupt brain development in biology and make all human beings stupid and reliant on these systems possibly dooming our species. The engineers of these programs can't even predict what will even happen when this model is 100% up and running. That to me is bad science and its fking with our lives all because we don't want China doing it first. So please don't act like you know what is going to happen when not even the AI experts know themselves if they can even control it.

Source: youtube · AI Jobs · 2025-06-23T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[{"id":"ytc_UgzMhGUvkq84PWNwyV94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyyVkd7g4WyOVimD3d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxXTVllgbntB1RVy_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxjFmnE2Q3XtoTugW54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxfQJeBrEVzYHND5uB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx-mOd6-B-t1zIppj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzpaPITBcqCeq7HR4Z4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwieySB8uV8WKbJTmp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyGg95WUEOPuZ3PD6B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzA5R05h4efz1ql48t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"}]
```
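The raw response is a JSON array with one record per coded comment, each carrying a comment ID plus the four coding dimensions. A minimal sketch of how such output might be parsed and validated before use; the allowed values below are inferred only from the codes visible in this section, and the actual codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from the codes shown above;
# the full codebook may include additional categories (an assumption).
ALLOWED = {
    "responsibility": {"none", "distributed", "user", "ai_itself",
                       "government", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"unclear", "regulate", "none", "ban", "liability",
               "industry_self"},
    "emotion": {"fear", "indifference", "resignation", "outrage", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records that lack a comment ID
        # Keep the record only if every dimension has a recognized value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# First record from the response above, used as a small smoke test.
raw = ('[{"id":"ytc_UgzMhGUvkq84PWNwyV94AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(validate_codes(raw))
```

Dropping malformed records rather than raising keeps a long coding run alive when the model occasionally emits an off-schema value; the dropped IDs can then be re-queued for recoding.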