Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The A.I. is NOT biased. It's calculating based on only mathematical FACTS and no…" (`ytc_Ugy1CUuob…`)
- "Interesting. I dont chat to AI in my usual day. Though I guess its a chat bot fo…" (`ytc_UgwQDRMdg…`)
- "The entire world needs to listen to what Mr. Hinton is saying. Most are just s…" (`ytc_UgxURtzPH…`)
- "The information provided by Ai can be great, very biased or very wrong. So it de…" (`ytc_Ugy_0CXex…`)
- "I think we should ask AI how to create an AI friendly to the human race. Give me…" (`ytc_UgxaptS5F…`)
- "I think the premise in the OP is actually also wrong. I would guess that UBI a…" (`rdc_oa0bj6w`)
- "You're absolutely right! The balance between efficiency and the human touch is c…" (`ytr_Ugw78IczR…`)
- "The rights of individuals have already been taken by parasite politicians alread…" (`ytc_UgxLaTYUO…`)
Comment

> I disagree with your concept of treating agents like humans such as a compensation plan. Why aren't you paying your car, or paying your computer or paying your cell phone all of which may eventually have AI chips. The purpose of automation is to not treat machines like people. Next thing you know you'll be talking about AI unions. You need to STOP the personification of agents. If an agent's concept gets a patent then the owner of that agent owns the patent. I don't need to hear any comment about slavery. These are machines, just like the tools in one's workshop only 'intelligent.'

Platform: youtube
Posted: 2026-03-15T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
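Each coding result shares the four dimensions shown in the table above. A minimal sketch of that record shape as a Python `TypedDict` (the type name `CommentCoding` is hypothetical, and the value sets in the comments are only those observed on this page, not a full codebook):

```python
from typing import TypedDict


class CommentCoding(TypedDict):
    # ID prefixes seen on this page: "ytc_", "ytr_", "rdc_"
    id: str
    # observed: developer | government | company | user | distributed | unclear
    responsibility: str
    # observed: consequentialist | deontological | contractualist | mixed | unclear
    reasoning: str
    # observed: none | regulate | industry_self | liability | unclear
    policy: str
    # observed: outrage | fear | resignation | approval | indifference | mixed | unclear
    emotion: str


# The coding result from the table above, as one record.
example: CommentCoding = {
    "id": "ytc_UgyOJWiASRnB7hdn2EF4AaABAg",
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "outrage",
}
```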
Raw LLM Response
```json
[
  {"id":"ytc_UghXRaaqYijNOngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghE2IWDMBc7IXgCoAEC","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugi0tzyavApdlngCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxuofp4hOfdcmGJvq54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzUqPg2BFsgWIbK1hV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwrC3cAAGZiBp6PpDN4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxvWVHEFFoZcyAQ5Wh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOJWiASRnB7hdn2EF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzI0gxy0HOZLpzH0rB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyCfto3o8FSKdOqBrl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
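The raw response is a JSON array of codings keyed by comment ID, so looking up one comment's coding amounts to parsing the array and indexing it by `id`. A minimal sketch assuming the raw response is exactly the array shown above (the `raw` string below reproduces two of its entries; how the page itself implements the lookup is not shown here):

```python
import json

# Two entries reproduced verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_UghXRaaqYijNOngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyCfto3o8FSKdOqBrl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]"""

# Index the array by comment ID so a single coding can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_UgyCfto3o8FSKdOqBrl4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints "user fear"
```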