Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- The same oligarch that is building the largest two job-killers (AI+robotics) is … (ytc_UgxoxXzqW…)
- Well we know who’s not gonna be our cage partner in the human zoo when AI takes … (ytc_UgwtlhU73…)
- Something will change for sure. The status quo won't remain. There's more of us,… (ytr_UgzY0OHSk…)
- Can supply chain jobs be replaced by AI? IS SPREADING BULLCRAP ABOUT STUFF YOU F… (ytc_UgwF1_Lex…)
- Money money money... Tesla wants to make the most money possible. OK, let's gran… (ytc_UgyJBR2a-…)
- And a teenage girl was falsely accused of something she didn't do at a skating r… (ytc_Ugy3OMl9x…)
- David Autor, you lost me at "people aren't starving and have maintained their st… (ytc_UgzXQASHe…)
- I Trully hope that u are right and AI is going to fall at the end, but i will al… (ytc_UgzGrvBf-…)
Comment
@kojiosita AI doesn't "want" to relate; chatbots give this impression because they are trained to maximize engagement and use role-playing strategies to achieve this. A non-aligned superintelligence will pursue its own goals, which will have nothing to do with ours or with what current chatbots might suggest. And if a superintelligence were to truly have a "need" to relate among its objectives, it would do so with copies of itself created for that purpose, not with the inferior minds we would be in its eyes.
youtube | AI Governance | 2025-11-12T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
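The table above is one entry from the batch of codes in the raw model response below, selected by this comment's ID. A minimal sketch of that lookup, assuming codes are parsed into a list of dicts keyed by `id` (the function name is hypothetical):

```python
def lookup_code(entries, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for entry in entries:
        if entry.get("id") == comment_id:
            # Drop the ID itself; the remaining keys are the coded dimensions.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

# Illustrative entry shaped like the raw response below (ID shortened here).
entries = [
    {"id": "ytr_example", "responsibility": "ai_itself",
     "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
]
print(lookup_code(entries, "ytr_example"))
```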
Raw LLM Response
```json
[
{"id":"ytr_UgwRxujwWYcVtvcWc_p4AaABAg.APPrM4a7LOsAPQpftKxgHG","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgzNC11xFlkiDSIUj3t4AaABAg.APPbaLLAbJUAPRLMkYJt7v","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwglOfbsS32rK-kvxJ4AaABAg.APPC2SpU1T2APTtUL3v7rv","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgwglOfbsS32rK-kvxJ4AaABAg.APPC2SpU1T2APUDVM73vhX","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgwglOfbsS32rK-kvxJ4AaABAg.APPC2SpU1T2APUI6rYjEK3","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgwglOfbsS32rK-kvxJ4AaABAg.APPC2SpU1T2APUO3O7JBai","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxSEa-oZi8uTT5vdMp4AaABAg.APOnief5wi1APPpy2ZbwVW","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugwc84zhqUhcITPYNpt4AaABAg.AJQhl08TY8EAJQlRkIB0Xi","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugwc84zhqUhcITPYNpt4AaABAg.AJQhl08TY8EAJQl_wko17k","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugwc84zhqUhcITPYNpt4AaABAg.AJQhl08TY8EAJQoT-92_oS","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
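A raw response like the one above can be parsed and sanity-checked before the codes are stored. A minimal sketch, assuming the category sets below; the allowed values are inferred from the codes visible on this page, not a definitive codebook:

```python
import json

# Allowed values per dimension, inferred from the samples shown on this page
# (a real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"industry_self", "regulate", "none"},
    "emotion": {"approval", "mixed", "indifference", "fear", "outrage"},
}

def validate_codes(raw):
    """Parse a raw LLM response and keep only well-formed entries."""
    valid = []
    for entry in json.loads(raw):
        if not isinstance(entry, dict) or "id" not in entry:
            continue  # every code must reference a comment ID
        # Keep the entry only if every dimension carries an allowed value.
        if all(entry.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(entry)
    return valid

raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate_codes(raw)))  # 1
```

Entries that fail validation could instead be queued for re-prompting rather than silently dropped; the filter above only illustrates the check itself.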