Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "Sora ai : insted of making art their own they just steal art from others :/…" (ytc_UgxJ8rqvz…)
- "People create robots in the hope that they will have a 'life' better than the life…" (ytc_Ugxjt8SOv…)
- "DonaFKO, it sounds like you know a lot about how AI programming works. Thanks f…" (ytr_UgyLoVQ4S…)
- "I can understand why it might come off that way! The interaction between humans …" (ytr_Ugy4DZyJf…)
- "Just proves that every tesla robot we've seen as never actually been acting auto…" (ytc_UgwAE-CMh…)
- "Ai gives codes, but does not guarantee bug free. CEOs and VPs don't care sinc…" (ytc_UgybzKZQN…)
- "Meanwhile, humans are killing 2 per hour and nobody says a word. Artificial int…" (ytr_UgxxJqaly…)
- "May Ai eliminate the dysfunction in our society. Too many public safety issues. …" (ytc_Ugym-2u-B…)
Comment
> You still need them for their legal ability to take responsibility.
> Besides this madness can't last. Someone will launch Golden Eye to stop it when automated hacker atracks go completely out of hand. It's no fiction, electronics actually do get fried like in the movie.

| Platform | Video | Posted |
|---|---|---|
| youtube | AI Governance | 2026-02-25T01:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugwu5ap18Kx4RtVK_Rd4AaABAg.ATbrQa8zq7bAUGZIGsjH7n","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugwu5ap18Kx4RtVK_Rd4AaABAg.ATbrQa8zq7bAUG_G956YFU","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugx67CqIzYBN_7BtSrJ4AaABAg.ATbPVbH3twrATcxPXejVS2","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugx1TPAuC5cZ2QNZY4p4AaABAg.ARUKl9dE15KAS2zqLc8pkZ","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugx1TPAuC5cZ2QNZY4p4AaABAg.ARUKl9dE15KAS4jSCmJOnU","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzS3RD618mmsfMtLdB4AaABAg.ARTkjQIKkMiARdNuEAoRbE","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzS3RD618mmsfMtLdB4AaABAg.ARTkjQIKkMiARdh55Hh9DC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugw1AkmV7ctF7jLgQoJ4AaABAg.AQpHRT005VgAQq-QmDYnWo","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugx2C7Z9rgdLSHiI0O14AaABAg.APf4V73gC9VAPgdK8auaSD","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugx2C7Z9rgdLSHiI0O14AaABAg.APf4V73gC9VAPhITKHdlY_","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
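A response like the one above has to be parsed and checked before its codes can be trusted. The sketch below shows one minimal way to do that in Python: parse the JSON array and keep only records whose four dimensions take values from the codebook. The allowed-value sets are an assumption inferred from the labels visible on this page (e.g. "developer", "consequentialist", "liability", "fear"); the actual codebook may define more categories, and `parse_coding_response` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Assumed codebook, inferred from the values visible in this page's
# table and raw response; the real schema may allow more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse one raw LLM response and drop malformed records.

    A record is kept only if it is a dict with an "id" and every
    coding dimension holds a value from the assumed codebook.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # no comment ID to attach the codes to
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage: a well-formed record passes, a record with a value outside
# the codebook (or with no "id") is silently dropped.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"fear"}]')
coded = parse_coding_response(raw)
```

Dropping malformed records rather than raising keeps one bad line in a batch from discarding the other nine codings in the same response; the dropped IDs could instead be logged for re-coding.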