Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_UgxtfXgmv…: THIS RIGHT HERE!!! I must talk with my child. I don't want him to be comfortable…
- ytc_UgxwsLnP1…: I personally feel like the argument with AI art and the whole "This generated pi…
- ytc_UgxxkYEBG…: Copywrite lasts for the life of the artist plus 70 years after death and was put…
- ytc_UgwZNWJoU…: I mean drive throughs don’t need AI to make similar fuckups, just ask any Canadi…
- ytr_UgzcpJ_q7…: Like him or not, this guy has many books and he has followed AI for some time. H…
- ytc_Ugy18I48H…: it's so childish of those you'd think great minds- "I don't like to think about …
- ytc_Ugyy2bwIi…: Artificial Ethics (AE) will protect AI, not humans. AI doesn’t care at all abou…
- ytc_UgwZUrw8Z…: Ai is demonic and you will all someday see that if you survive long enough to le…
Comment
The concept of consciousness is probably the most important and oldest question of all. What does it mean to be a "conscious" being? It's obvious it's an emergent property. It's also obvious that there are different levels of consciousness. Think about the people you know or have known; some are more conscious than others; it's a continuum. A.I. could certainly become conscious, but it will be weird, and not warm and fuzzy at all. The key question will be: what is its motivation? What is it trying to achieve? Certainly not the usual human things. No one really knows, but I guess we'll find out.

Source: youtube · Video: AI Moral Status · Posted: 2025-11-16T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugyl1bbxT41Twv_gybF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYNWrJB-5bterLdbh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgykOAPCSK7ylJOnMlF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy0NXu74PXPoC-kdRl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz8y7p23jiOgpGoHFJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx42mjM90G8kVrDjsx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzbjCPUvMbe-L37fPd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyb9RmsrStwn4X4uiR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxc4de9QznC4mDe6-p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8bFxaXe6-Nd-0XuF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
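A raw response like the one above is a JSON array of per-comment records, which makes the "look up by comment ID" operation a simple dictionary build. The sketch below is illustrative, not the tool's actual code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the coding-result table shown above, while the function name and the abbreviated sample data are assumptions.

```python
import json

# Abbreviated sample of a raw LLM coding response (two records from the dump above).
raw = """
[
  {"id": "ytc_Ugyl1bbxT41Twv_gybF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgykOAPCSK7ylJOnMlF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw)
print(codes["ytc_UgykOAPCSK7ylJOnMlF4AaABAg"]["policy"])  # liability
```

Keying on the comment ID lets the coded dimensions be joined back to the original comment text and metadata (source, video, timestamp) in one lookup.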