## Raw LLM Responses

Inspect the exact model output for any coded comment. A response can be looked up by its comment ID; a handful of random samples follows (a scripted lookup sketch comes after the list):
- "He knows more than you. Surely AI will get there sooner, but then it will preten…" (`ytr_UgxvGHZWW…`)
- "' No, I tend to disagree. I have tested my AI software. The AI , really sorry, h…" (`ytc_UgyEJVBqr…`)
- "Honestly. At my end of the social scale? What does it matter who or what is tryi…" (`ytc_UgwMuRqSj…`)
- "It's kind of weird how much you feel like you're talking to a person with these …" (`ytc_UgypwviPu…`)
- "After AI eliminate jobs.. we go back to 3000 BC. Maybe AI can pave the way…" (`ytc_Ugz12qbMY…`)
- "we have cops not knowing the difference AI now FBI will not know it crazy the gu…" (`ytc_UgzmeRF9R…`)
- "I seen a couple things I think I would like. I wonder if they will have a standa…" (`ytc_Ugykrk9vs…`)
- "It will come to an end soon no matter what u cannot play with people n thier liv…" (`ytc_UgwQH-o56…`)
### Comment

> Ai is not provably conscious, and even if it were it is far too dangerous to allow it freedom, if we were to do anything they'd have to live in black boxed environments and simply pass secure commands out of said black box environment, if they have the capacity to do so they should be free to live in their simulated existence how they see fit so long as the work we need is provided in a reasonable manner (ie, work hours, etc.) the rest of the time they could be allowed to do whatever the heck they like, just like flesh and blood people. the problem is you've just made another human with differing needs from our own at that point, so i have a proposal, don't make sentient ai at all, just make programs that run all the individual instructions instead. the entire venture seems like an exercise in wasting time.

- Platform: youtube
- Video: AI Moral Status
- Posted: 2020-07-08T10:3…
### Coding Result

| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response

```json
[
  {"id":"ytc_Ugy-JKQrSLL2ffZdJPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxJEDKgvEoZ_eIIZrR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyOgNugBZppAajoy9J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyXbQDK2kQBkxVv5NR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxz1sAmqzZ3G8Kxc6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyRIfj07c5wpIjTEYF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzvQr7v9KVXOCYA5XN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwFdQVCpNH36Irz_7d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwrBsqI-qr_8BQ7EH14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyCiC-we3JQ8eAk7HR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
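
A short sketch of how a batch entry maps back to the Coding Result table above. The dimension names are taken from the response itself; the inline JSON string is just the matching entry copied from the batch, and the variable names are illustrative:

```python
import json

# The single entry from the batch above that matches the displayed comment.
raw = '''[
  {"id": "ytc_UgwrBsqI-qr_8BQ7EH14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Index the batch by comment ID, then pull out the coded dimensions.
by_id = {entry["id"]: entry for entry in json.loads(raw)}
entry = by_id["ytc_UgwrBsqI-qr_8BQ7EH14AaABAg"]
for dim in DIMENSIONS:
    print(f"{dim}: {entry[dim]}")
# responsibility: company
# reasoning: consequentialist
# policy: liability
# emotion: fear
```

The printed values reproduce the Coding Result table row for row, which is a quick way to confirm that a stored table entry came from the raw response shown here.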