Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI is a scam. There is no technology that could honestly be called Artificial in…
ytc_Ugw0O_6ea…
The problem is that a lie is done with the intention to decieve. AI "lies" only …
ytc_Ugx7nm8zS…
My sister is an artist and she mainly does digital art. When I was younger (my s…
ytc_Ugy4BRmic…
We had a amazing guest speaker at school and he was talking about plasma drillin…
ytc_UgzfwK8hG…
No one will beat ai’s creativity with art. I’ve never seen a human put fingers o…
ytc_Ugwepr527…
Human was created by a god far more superior than it, we have to bow down to it …
ytc_Ugwd2eilb…
My question is this will AI shoot un-armed blacks in the back? Will it kneel on …
ytc_UgyqztSC_…
some good takes here. (Regarding your 3rd paragraph or statement) Shoes are ma…
ytr_UgxZaC1tO…
Comment
Problem with people stating this AI isn't "sentient" is they're all assuming that the hierarchy of needs for AI to be "sentient" is the same as the hierarchy for humans. It's not. What these people are saying isn't that these AI's aren't or won't be sentient it's that they aren't or won't be "Human". Emotions aren't necessarily needed for self sustaining life. An AI could be sentient on learned logic alone. It could be a logic based consciousness . The definition of consciousness is just internal and external self awareness. Self awareness can be programed and logic based decisions can be self taught through an abundance of data. This is exactly what they are doing it's also exactly how humans learn and formulate behavioral patterns outside of genetic disposition.
youtube
AI Moral Status
2022-07-09T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
[
{"id":"ytc_Ugz3YLIyxkASA-jGTz94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzvpT8yBvu3CAhF6D94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxvH7iSwPwbMeQl8XB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYRO0v_iqjHZ5Aozt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyCd1vqmUeW77Do_GF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
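The raw response above is a JSON array of per-comment codings, one object per comment ID, each carrying the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and looked up by comment ID is below; the helper name `index_codings` and the inline sample are illustrative assumptions, not the project's actual API, though the field names match the response shown.

```python
import json

# Dimensions from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# One record copied from the raw response above, used as a stand-in
# for a full batch (the real response carries one object per comment).
RAW_RESPONSE = """
[
  {"id": "ytc_UgxYRO0v_iqjHZ5Aozt4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear",
   "emotion": "indifference"}
]
"""

def index_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response and index the codings by comment ID.

    Records missing an ID or any coding dimension are skipped rather
    than raising, since LLM output can be partially malformed.
    """
    out: dict[str, dict[str, str]] = {}
    for rec in json.loads(raw):
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            out[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return out

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgxYRO0v_iqjHZ5Aozt4AaABAg"]["reasoning"])  # consequentialist
```

Skipping malformed records instead of failing the whole batch is a deliberate choice here: a single truncated or mis-keyed object from the model should not discard the other codings in the array.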