Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.
Comment
Everything on earth evolved to feel in order to survive. We feel love in order to reproduce, we feel fear to avoid danger, we feel hunger so we know to feed ourselves. We didn't evolve to think until there would be a tangible advantage to doing so. I believe conscious experience comes from the combination of the two. The ability to feel something then think about and act independently of that feeling.
AI will be doing it backwards. We'll teach it to think before we teach it to feel. I'm not sure if the latter is possible however.
Source: youtube
Video: AI Moral Status
Posted: 2024-04-13T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
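Each coded record carries the four dimensions shown in the table above. A minimal validation sketch follows; the allowed value sets are only those observed in this sample's raw response (the full codebook may define more), so treat `OBSERVED_VALUES` as an assumption, not the canonical schema.

```python
# Allowed values per coding dimension, inferred from the codes observed
# in this sample's raw LLM response — the full codebook may define more.
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "industry_self", "ban"},
    "emotion": {"indifference", "approval", "fear", "mixed",
                "outrage", "resignation"},
}

def validate_code(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if clean)."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems
```

A record that uses only observed values validates cleanly; an unseen value is flagged rather than rejected outright, since it may simply be a codebook value absent from this sample.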
Raw LLM Response
```json
[
  {"id":"ytc_UgzmXhCqMQjDonDLyth4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzK9603KG8Emczqfv14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz6BdS5Cp00G5TN7up4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwu5zGhmdX_CRkZ7pF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyqWd5oO19HEMLCbe94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyUV7qRjr0l8NdtOgV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyI_ImX_9qghE085KR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxR88njZyHQVjaWWat4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwkL0fdVsHQTz_gyel4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzpaJIpFRcVN9u8fR14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```