Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgyElMhjc…: "I asked chatgpt to build a self replicating AI script for me and it told me it c…"
- ytc_Ugy-9oymb…: "I remember Andrew Yang was running for NYC Mayor, I thought he was a very bright…"
- ytc_UgzlIjG9x…: "This AI is Y2K all over again. AI is so smart that I can just flip this switch o…"
- ytc_Ugy7EkQ36…: "You should always thank it, I had a friend that was making questions to it and t…"
- rdc_n0ms3zd: "Yall sent want anyone to work for you if the only contact method is a shitty cha…"
- ytc_UgwAWMmLn…: "I've said it before. One read of human history where every war could have been a…"
- ytc_UgwPmiP40…: "big fan of NDT... but he was a terrible host. Made far to light of a very seri…"
- rdc_f6nt8vy (WhyAmINotStudying): "For statistics in Canada, the 420 sample size represents 9…"
Comment
We can't teach our own people morality. What makes you think we could teach an AI? We've argued about morality for thousands of years. I don't trust the people doing the coding to make a moral AI because their own morals are broken and busted due to living in an echo chamber.
Platform: youtube
Topic: AI Governance
Posted: 2023-12-31T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgyhqYv4FL8Ya4RQBk14AaABAg.9yyl3kyNe0c9z2DSwE8uub","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyhqYv4FL8Ya4RQBk14AaABAg.9yyl3kyNe0c9z2SXpsz6vv","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxgk_eByOP4KqcvcTl4AaABAg.9yykAD_G0mM9z2vxCJ8PxD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzrQ4tANQoHQaJQZIZ4AaABAg.9yyetSevnF49z-LBo-r0GP","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugysz9guoNU1ZPNYX2Z4AaABAg.9yyXvsD5crz9z1shnvDtKL","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_Ugysz9guoNU1ZPNYX2Z4AaABAg.9yyXvsD5crz9z29AMVD0hD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxNq-VpZLp98MS15CV4AaABAg.9yyWXPyMAsm9z7HAqI8aBX","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgwMvQp3zZ4uC8Pny914AaABAg.9yyVhcldg4k9z-5YBNJbBo","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwMvQp3zZ4uC8Pny914AaABAg.9yyVhcldg4k9z-kJpQa5nn","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgyLW1Tk0CTuI56YVo94AaABAg.9yyVQiSZj2E9yzfRtGWlI9","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
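A response in this shape can be parsed and validated before its codes are stored back against comment IDs. Below is a minimal sketch in Python; the `CODEBOOK` value sets are an assumption inferred from the values visible in this page, not the project's actual codebook, and `parse_llm_response` is an illustrative helper name.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are an assumption,
# collected from values visible in this inspector dump; the real codebook
# may define additional categories.
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting rows
    that are missing a dimension or use an out-of-codebook value."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row without id: {row!r}")
        codes = {}
        for dim, allowed in CODEBOOK.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded
```

Validating before storage matters here because a model occasionally invents a label outside the codebook or drops a dimension; failing loudly on such rows keeps the coded dataset consistent with the dimensions shown in the Coding Result table above.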