Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- Sounds like we’re 3-5 years away from a major structural failure somewhere where… — `rdc_nm0rshb`
- AI requires lots of data to be able to provide recommendations that you can be c… — `ytc_Ugw68yhnT…`
- I’m sorry, but when she was like “Shut that off!” I just laughed so much.… — `ytc_UgwiLqP0Q…`
- AI does not "have" knowledge. Humans have knowledge. AI has access to data repre… — `ytc_UgwVT3eBw…`
- 58:45 if AI can intentionally give us the wrong answers, is this where people ne… — `ytc_UgzJPtvov…`
- I have a severe mental disability and the reason why I bought a Tesla was for th… — `ytc_UgzS2weVY…`
- I really dont think ia is the reason he died, I think he was just suicidal, with… — `ytc_Ugx5yHDlz…`
- Developers should learn 2 very important things from this scenario: 1. Don't dig… — `ytc_Ugys_7Sty…`
Comment

> Since AI is built on our messy, imperfect world, it’s bound to blurt out a few non-truths now and then — the trick is to fix its little “oops” moments before they grow legs and run wild!

youtube · AI Governance · 2025-08-22T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzxN5G-HEPZtFefR3B4AaABAg.AM8Wl9suTNGAMGvUYC7pIv","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugxx9qpGuLd2ykgLO_F4AaABAg.AM7sedQbyMIAM8MkMaCRmW","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxx9qpGuLd2ykgLO_F4AaABAg.AM7sedQbyMIAM8U6B1Wrs-","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytr_UgzUCeUp19d7VNj4li54AaABAg.AM7r-yqK7sIAM9GSCPrC-L","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxmJOtMC0JNnttbqRB4AaABAg.AJhUbwUumESANr5nAD7yaE","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxe2d2JPYqAUiHFe854AaABAg.AG5k4KeGE9sAPo3930YHdr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxPwbPKWkG7J43GV-R4AaABAg.AFmxL8-SEYZAG5zd7Gx9wB","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytr_Ugz6fw_yPs5y9_v87Qp4AaABAg.ACRye-XycaqADHF9AAflQf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy-isrs7pM_0H3TYNN4AaABAg.AB52ZnmZE_tAE7YQyMZnHV","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzFRMLJYKKj3rDmZOV4AaABAg.AVsbjQmcNf8AVsjey0y1xV","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
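A raw response like the one above can be parsed and indexed for the "look up by comment ID" view with a few lines of Python. This is a minimal sketch: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON shown, but the comment IDs and the `lookup` helper here are illustrative, not part of the actual tool.

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment.
# The IDs below are made up for the example; the schema mirrors the output above.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate",
   "emotion": "resignation"},
  {"id": "ytr_example2", "responsibility": "user",
   "reasoning": "deontological", "policy": "none",
   "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID so a single comment can be
# inspected in O(1) instead of scanning the whole array each time.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the full coding result (all four dimensions) for one comment ID."""
    return codes_by_id[comment_id]

print(lookup("ytr_example1")["policy"])   # -> regulate
print(lookup("ytr_example2")["emotion"])  # -> outrage
```

Building the dictionary once and reusing it is the natural fit for this page, where many random samples may be inspected against the same batch of model output.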