Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "You're right with this as well. Because most AI that are consumer facing try to …" (rdc_l5bt2xb)
- "I personally love ai art because I suck at drawing, and I never want to use some…" (ytc_Ugz8-72_E…)
- "The fact that this is an argument is so dumb. People who believe AI art is ethic…" (ytc_Ugytv-blE…)
- "Reminds me of kiosks in grocery stores. Customers that opposed them took there b…" (ytc_UgwGmT-Yg…)
- "I mean, it's the intelligent people invented these AI stuffs to help develop & i…" (ytc_Ugzse2H-1…)
- "I hope they regulate it so that large language models can no longer be advertise…" (ytr_Ugy7hg7Oy…)
- "No surprise hs amd trumpnwant ai tonrun wild so they can bs wirh fake videos…" (ytc_UgxzqrhjZ…)
- "if robots walked amongst us I would build a shelter to prepare for the robot reb…" (ytc_UgiOOXMzr…)
Comment
Right, but when will we be able to distinguish the difference? What if we won't be able to? Sentience isn't even clearly defined.
He's arguing that even if the AI isn't actually sentient right now, we should be proactive and start treating them like they are sentient, otherwise we may accidentally be enslaving beings.
For example, say 10 years from now that we start recognizing AI as sentient and give them human-level rights. But in fact, our definition of sentience wasn't very good, and they were actually sentient 7 years ago. Thus, we accidentally enslaved a living, sentient being for 7 years. Ethically, we should start treating them as sentient *before* we finally come up with some sure-fire way to define sentience. Otherwise we risk being wrong in our definition and enslaving a being against its will.
Source: youtube · Video: AI Moral Status · Posted: 2022-07-12T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgxrcQFPgHRFwm6MDhJ4AaABAg.9dGGvDX2aw39dHa5QmE_Vk","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxrcQFPgHRFwm6MDhJ4AaABAg.9dGGvDX2aw39dHyEvSlbQk","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugw40_NP11jbDjRwhpp4AaABAg.9dGGTLJjjn49dKhMufCJR1","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugw40_NP11AaABAg.9dGGTLJjjn49dLeJLviZW4","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyO62Om1P2KZYWPx3d4AaABAg.9dG5SAsPOTR9dIxZuqYj51","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugxvtigk2sxu5IjOWx94AaABAg.9dG4lXRmmS19dJaLa5A1fh","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwE77ZdjeaSQsINvuB4AaABAg.9dDgnAJMvvm9dMrc3xtnWa","responsibility":"none","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytr_UgxiIyEknaRONURscF54AaABAg.9dDCYFdSi_D9dESgfdCpAv","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxiIyEknaRONURscF54AaABAg.9dDCYFdSi_D9dF_daO50mH","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxiIyEknaRONURscF54AaABAg.9dDCYFdSi_D9dG1v0eFHfU","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
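The lookup-by-comment-ID workflow above can be sketched in Python: parse the raw batch response, check that every record carries the four coding dimensions from the table (responsibility, reasoning, policy, emotion), and index the records by `id`. This is a minimal illustration, not the tool's actual implementation; the function name and the required-key check are assumptions, and only the dimension names come from the source.

```python
import json

# One record copied verbatim from the raw LLM response above, used as sample input.
RAW_RESPONSE = """[
  {"id":"ytr_UgwE77ZdjeaSQsINvuB4AaABAg.9dDgnAJMvvm9dMrc3xtnWa",
   "responsibility":"none","reasoning":"mixed","policy":"liability","emotion":"mixed"}
]"""

# The four coding dimensions shown in the "Coding Result" table, plus the comment ID.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding record by comment ID.

    Raises ValueError if a record is missing any expected dimension.
    (Hypothetical helper; the validation rule is an assumption.)
    """
    indexed = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        indexed[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return indexed


codings = index_codings(RAW_RESPONSE)
record = codings["ytr_UgwE77ZdjeaSQsINvuB4AaABAg.9dDgnAJMvvm9dMrc3xtnWa"]
print(record["policy"])  # liability — matches the Coding Result table above
```

Indexing by ID this way makes the "Look up by comment ID" box a constant-time dictionary lookup rather than a scan over every batch response.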