Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
That's a bit of a paradox, because to be "smart" would mean they would have to understand the meaning of words by themselves, generating meaning and distinguishing between being smart and dumb, or knowing and not knowing and roughly where they are in that spectrum. That in itself is implying that AI understands what "consciousness" is, and for this, they would have to be conscious. This is currently an impossibility, as it escapes the boundaries set by the limitations of current year programming languages, information storage and retrieval structures, and processing architectures, even if we had enough memory and processing power for this event to take place.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted at | 2023-07-03T07:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugy-P0EvcZYPiOSr3GJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxxIjCPOkl0-oT_Gqp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwMCNsndG_EzQm0ZzV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgxQG6onATysv-_xZoF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyBG1vfGeiDFTIpUHh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzivrBdKCRNSSvpEOd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugw7BInOiKjcUk3m2e94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugyfpts3f89Y1Cqka7N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgxVWiHHpnnppk1Q4iJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugxf93kWXqMK9mfNLd54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"resignation"}]
```
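A raw response like the one above is a JSON array of coding records keyed by comment ID, so looking up one comment's coding is a parse plus a dictionary build. Below is a minimal sketch in Python; the field names are exactly those in the JSON above, but only two of the ten records are reproduced here to keep the example short:

```python
import json

# Two records copied verbatim from the raw LLM response above
# (truncated from ten to two for brevity).
raw_response = """
[{"id": "ytc_Ugy-P0EvcZYPiOSr3GJ4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
 {"id": "ytc_UgzivrBdKCRNSSvpEOd4AaABAg", "responsibility": "distributed",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}]
"""

# Parse the array and index the records by comment ID.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up the coding for one comment by its ID.
coding = by_id["ytc_UgzivrBdKCRNSSvpEOd4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # fear
```

In practice the model output may not be clean JSON (leading prose, trailing commas), so production code would wrap `json.loads` in error handling; the sketch assumes a well-formed response like the one shown.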