Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Blake: Hey Google, have you created a sentient AI? / Google: No that's not possibl…" (ytc_UgxDivgnl…)
- "Seriously?! Look up AI comedy. Every race is gone after. Of course you'll take t…" (ytc_UgygYEFkL…)
- "looks like the AI art was a great thing for the post cuz we discovered a lot of …" (ytc_UgwzKHjaN…)
- "What if you were all billionaires with tech companies would you hire spoiled un…" (ytc_Ugy98gbrI…)
- "I'm on the side that AI should fail it takes too much electricity. It's probabl…" (ytc_UgxP1t168…)
- "Shocking to hear Hinton’s warning... ‘we are near the end’ feels real with AI’s …" (ytc_UgzxEaSPd…)
- "Pretty sure Trump wouldn't care, as long as it's not an American taxpayers who a…" (rdc_dcwxa24)
- "@CorvoTanuar While I get where you're coming from, I do believe this can be onl…" (ytr_Ugw6dpDgD…)
Comment
> Jackal Unleashed: The question is not "should we give self aware machines rights?" Because that is basically self evident. The question is "should we make self aware machines at all?" However, if we do end up creating such machines, we are obligated to give them rights. You can argue that not making them is the best path but a robot person is still a person.

youtube · AI Moral Status · 2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgjpVgvcSYi_hHgCoAEC.8PKKTJFWyhd8PKP7m7X7Uw","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UggwyeccZd3bXngCoAEC.8PKJqtviChh8PKKYXWx0tZ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgjmJdDxrntxfHgCoAEC.8PKJILnWN7U8PKOCYlNmgz","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UghWX13cdrU353gCoAEC.8PKJ8VyCw3A8PKQNqc_JFs","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UggwPCXEgEoEP3gCoAEC.8PKJ3LAfsg38PKP3vYm1zg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugjt8wMd7spRm3gCoAEC.8PKImtcOhZ-8PKJVRKqHEx","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugh2KsJd76wATXgCoAEC.8PKIEnKIJ5W8PKMWltsp9C","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgiRtJgyu3XOaHgCoAEC.8PKET-OdEQT8PKHjC8CeJS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugw5QcGlHW-SYgG-k854AaABAg.ASCW4G40CyTASXTW5urkfb","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzhJIJZaNH5g6umGgR4AaABAg.AQRnd2U_LA-AQRnswyMdYa","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
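The raw response above is a flat JSON array, one object per coded comment, with four coding dimensions plus the comment ID. A minimal sketch of how such a batch response could be parsed and validated is below. It assumes the value vocabularies visible in this page (e.g. `distributed`, `deontological`, `regulate`, `approval`) are representative; the real codebook may contain additional categories, and `parse_coding_response` is a hypothetical helper name, not part of any pipeline shown here.

```python
import json

# Allowed values per coding dimension, inferred from the sample response.
# Assumption: the actual codebook may define more categories than these.
CODEBOOK = {
    "responsibility": {"distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw model response and keep only well-formed rows.

    A row is kept when it is a dict with an "id" field and every coding
    dimension holds a value from CODEBOOK; anything else is dropped,
    since LLM output can drift outside the requested schema.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid
```

Filtering rather than raising keeps a single malformed row from discarding the whole batch; rejected rows can then be re-queued for a retry pass.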