# Raw LLM Responses

Inspect the exact model output for any coded comment. You can look a comment up by its ID, or inspect one of the random samples below.

## Random samples
- Like imagine meeting the (non) racist ai version of yourself and just text them … (`ytc_UgwOOmskG…`)
- I'm from Amazon and its not trash. Unlimited usage of opus 4.1 with mcp servers … (`rdc_n9sm8ue`)
- The problem isn't turning off AI, but rather those who want to automate it; that… (`ytr_Ugy5lOxB1…`)
- “Hi there! I actually wrote a book called ‘Sapient, the rise of Artificial Intel… (`ytc_UgwyA0lTd…`)
- But OpenAI will die soon. And then Micropenis and Google will give up on this AI… (`ytc_Ugzy792k0…`)
- “Think we need to aim for the bots to get negative utility value from visiting o… (`ytc_UgzWBNhHc…`)
- Looking at how dumb AI still is sometimes I don’t agree with his projected timel… (`ytc_UgwpfJqRi…`)
- I think you explained it largely well; however, I think the learning argument yo… (`ytc_UgxHubwWu…`)
## Comment

> Eventually, we will create a sentient AI, so we might as well come up with some ground rules like creating laws that give the AI some responsibilities and rights. Unless we do, once that day arrives we'll be poorly prepared to deal with them and the companies/governments/individuals who claim to own them. What do you call ownership of sentient beings smarter than humans? Answer: Slavery. We should and can protect AI from being less abused/enslaved by us. And no, I don't think we are there yet, I don't think Google has a truly sentient AI, but developments in this area are impressive and so many smart young people are studying AI, so breakthroughs will happen. It's just a matter of time and we need to be prepared.

Platform: youtube · Video: AI Moral Status · Posted: 2022-07-06T11:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T19:39:26.816318 |
## Raw LLM Response

```json
[
{"id":"ytc_UgzxHcyWd_j4xAOSZ3t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgyrxRUWdQDT_cdhJEF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1ucTbqpRw89AMrgp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxmMQcduDb-dKtR9Bt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz-MqdvyRnMZz5lOxp4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
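The raw response above is a JSON array with one object per comment in the coding batch, keyed by comment ID. A minimal sketch of parsing such a response and looking up one comment's coding by ID (the `lookup_coding` helper is ours, not part of the tool; the sample data is truncated to two rows from the response above):

```python
import json

# A raw batch response in the shape the model returns: one JSON object
# per coded comment, keyed by its comment ID.
raw_response = """
[
 {"id":"ytc_UgxmMQcduDb-dKtR9Bt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugz-MqdvyRnMZz5lOxp4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the model's JSON array and return the coding dict for one
    comment ID, or None if the ID is not in this batch."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgxmMQcduDb-dKtR9Bt4AaABAg")
print(coding["policy"])   # → regulate
print(coding["emotion"])  # → fear
```

The per-dimension values in the Coding Result table are exactly the fields of the matching object in this array.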