Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by its comment ID.
Random samples — click to inspect

- "Why don't people have sense enough to just let AI do the work while people reap …" (ytc_UgzHxrtR_…)
- "@57kwestYup you're right there are so many areas like that where no AI will rep…" (ytr_UgyV-PGfj…)
- "ok, but why would we program torture to get robots to work for us? It's not like…" (ytc_UgyLyWoMM…)
- "I have a lot of respect for this guy, but he lost me in the last few seconds on …" (ytc_UgxQt1z4w…)
- "I think that ai can be a useful tool when it comes to programming but it really …" (ytc_UgxLJubHP…)
- "he asked chatgpt to take the pali side / the pali side is never about historical f…" (ytr_Ugx2yIMDI…)
- "Assuming that true FREE WILL actions can autonomously emerge from computer code …" (ytc_Ugx_zh-eC…)
- "I think even with an "ethical AI model", it kinda subverts the underlying purpos…" (ytc_Ugx55lEgM…)
Comment
I think that people can have purpose without having a job. We can have more time to solve problems we haven't been able to address. We can create. To have the time and not have to worry about housing? Can you think of all the things we could learn?
My job was already replaced by AI so I'm not all about how humans plan to use it. BUT giving people a stable life and not put them in jobs that hurt them. OH and think about all the disabled people who don't get disability bc of whatever reason, but they would be able to pay for somewhere to live. Dignity.
We can have purpose without have a job. I'm sure many people can and would love to. Boredom fuels creativity.
youtube · AI Governance · 2026-03-23T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
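A coded record like the one above can be checked against the coding scheme before it is stored. The value sets below are a sketch inferred only from the values visible on this page (the real codebook almost certainly defines more categories, e.g. other responsibility targets or emotions):

```python
# Allowed values per dimension -- INFERRED from this page's output only;
# the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "outrage", "fear"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown above passes the check:
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "approval"}
print(invalid_fields(record))  # → []
```

A non-empty return value flags a record for manual review rather than rejecting it outright, since the inferred sets may be incomplete.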
Raw LLM Response
[
{"id":"ytc_Ugzw1vzvWsRBavFKlIp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxibkyAt2eRuLJoiyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyC8kt137g31Q0HrTt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzntsYQZmSV07Svc5l4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwrV4cL3vQ226tHz9R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwE2ehfoSxI9NIl8Lt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwRySxgpnZLNXmumap4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzANA_xvGFDsoWFilF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzVFORpOs7xzh_rcf14AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzqSWWQIRjvatISbsZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
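The "look up by comment ID" step amounts to parsing this raw response (a JSON array of per-comment codings) and indexing it by `id`. A minimal sketch, assuming the response is well-formed JSON; the tool's actual internals are not shown on this page:

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment,
# using one entry from the response shown above.
raw_response = """[
  {"id": "ytc_UgxibkyAt2eRuLJoiyB4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

row = codings["ytc_UgxibkyAt2eRuLJoiyB4AaABAg"]
print(row["emotion"])  # → approval
```

In practice the parse should be wrapped in a `try`/`except json.JSONDecodeError`, since LLM output is not guaranteed to be valid JSON on every run.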