Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "@SimplilearnOfficial No there is Koda, is a robot dog with artificial intelligen…" (ytr_UgwHjwOx9…)
- "I don't think that super-inteligence will try to get rid of humans. First point …" (ytc_Ugy2wyWed…)
- "@ad3l547 it can be both. Talent isn't a requirement but it can be there. It can…" (ytr_Ugy0JMGft…)
- "@ you have a point. in fact, AI rises fast. too fast. the Sora AI can make 3D vi…" (ytr_Ugz6MyvpD…)
- "@UnemployedStormtrooper ya and that's the kind of basic bitch thinking I'm accu…" (ytr_UgwBpKTMk…)
- "AI plumber? AI orchestra member? AI surgeon? AI dentist? AI plane pilot? In how …" (ytc_UgznoJXJN…)
- "I watched video where AI showed a guy how to carry out a terrorist attack. It to…" (ytc_UgxMXc77W…)
- "I bet AI can do that by now. The problem is it learns very, very, quickly.…" (ytc_UgyfY36aL…)
Comment
@msqway11 Exactly. It seems that a thing has to possess subjectivity to be conscious and, by extension, possess intelligence. And there seems to be an element of being-in-the-world and being-of-the-world that makes subjectivity possible.
We also don't even know what lifeforms do or do not possess awareness/subjectivity. For all we know, a parameceum is not some kind of automata-esque zombie "object" simply appearing in the awareness of subjects but a subject in its own right, experiencing its own qualitative life. Crazy stuff. Sorry for the digression, but I think what I'm trying to get at is:
A) We appear to be far too ignorant as a species to be in any position to "make" intelligent "life"
and
B) We don't even really know what Life or Being are or how to define them but we seem to know enough to trust that rigging together a bunch of component parts and running a current through them does NOT a life make.
youtube · Cross-Cultural · 2025-06-29T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugw8zZhO0PAWVZzshVZ4AaABAg.AJy1ByQrb4QAK0oUNpkKBc","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwNKxIgGNfN92Yqzdp4AaABAg.AJy-vdHsoK-AJzx9rcxAkL","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwM2_DLwCYvoUoy7kt4AaABAg.AJy-OHPgaIwAJzOXvlGUvX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxGll7S1sLcqWUDzIV4AaABAg.AJxy6WO02aCAJzLKiJl5Bq","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgxGll7S1sLcqWUDzIV4AaABAg.AJxy6WO02aCAJzMm311PTd","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgxGll7S1sLcqWUDzIV4AaABAg.AJxy6WO02aCAK-pC8avMHk","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxGll7S1sLcqWUDzIV4AaABAg.AJxy6WO02aCAK-yl1R4w5i","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxRSMEyqh9oadRF6ax4AaABAg.AJxwcF2MxGjAJyEG_Gzveo","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxRSMEyqh9oadRF6ax4AaABAg.AJxwcF2MxGjAJyJshf9kYQ","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgzdHsWrC3roeihu_HF4AaABAg.AJxwRjf89wFAJxzdjyqdO4","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
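The raw response is a JSON array keyed by comment ID, so looking up a coding (as the "Look up by comment ID" control does) amounts to indexing the parsed array by `id`. A minimal sketch in Python, reproducing two of the rows above (the variable names are illustrative, not part of the actual tool):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw = '''
[
{"id":"ytr_Ugw8zZhO0PAWVZzshVZ4AaABAg.AJy1ByQrb4QAK0oUNpkKBc","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwNKxIgGNfN92Yqzdp4AaABAg.AJy-vdHsoK-AJzx9rcxAkL","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
'''

# Index codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

hit = codings["ytr_Ugw8zZhO0PAWVZzshVZ4AaABAg.AJy1ByQrb4QAK0oUNpkKBc"]
print(hit["responsibility"], hit["emotion"])  # company fear
```

A missing ID raises `KeyError`, which a dashboard would surface as "comment not coded"; using `codings.get(comment_id)` instead returns `None` for that case.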