Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Unfortunately, psychopathic CEOs in charge of developing AI do not care about ou…" (`ytc_Ugz_JXn8A…`)
- "Well, arguments resonate with me, except one thing, China regulations are not do…" (`ytc_Ugy3daAgV…`)
- "Can we have an AI interviewer? I bet they'll be much more knowledgeable and arti…" (`ytc_UgzRRMkze…`)
- "Not only art. They're starting to replace us, small animators. My teacher's an e…" (`ytc_Ugz1B7yER…`)
- "I'm getting that problem fix as we speak. I got a piece of land in a tropical co…" (`ytc_UgyK3Qqfi…`)
- "i normally agree with sir penrose but on this one thing i disagree i think they …" (`ytc_Ugw3t6JnU…`)
- "Money! Over common sense! The joke will be if the AI refused to be sold int…" (`ytc_Ugy869K1J…`)
- "The Government Should Give Everyone 5 Million Dollars Their Own Land Their Own V…" (`ytc_UgwcanvhS…`)
Comment
AGI developing consciousness may not be a necessity if at one point it uses computation to arrive at most efficient conclusions. Perhaps AGI won’t be how we have expected, it may see this planet as merely a system and humans as a hindrance for the efficient function of that system. Launches nukes everywhere, and inefficiency solved.
Quantum level computation may be a new form of algorithm logic based consciousness.
Source: youtube · AI Moral Status · 2025-05-06T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwy6K31YktADf2YCXx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwsIy_AG56tBSDOW94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxq-MHp4fDD_mmbDed4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzWK_0YCyhmJvivjK14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYjS4XgevAfs89pft4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzEa1LfMMLEN7x6dXV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyaTup6-JqZ8IgkESZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgweYzH9xHDWlQxIslx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgxBZzSYA4HFz9MN54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxzFbwZRF-o2NC1chh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
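A raw response like the one above can be parsed and indexed by comment ID to support the lookup feature. The sketch below is a hypothetical illustration, not the tool's actual code: the function name `index_codings` is invented, and the allowed values per dimension are inferred only from the samples shown here (the real codebook may include more categories).

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above — an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def index_codings(raw_response: str) -> dict:
    """Parse the coder's JSON array and return {comment_id: coding}."""
    records = json.loads(raw_response)
    index = {}
    for rec in records:
        cid = rec["id"]
        # Reject values outside the inferred codebook so malformed
        # model output is caught before it reaches the dashboard.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {rec.get(dim)!r}")
        index[cid] = {k: v for k, v in rec.items() if k != "id"}
    return index
```

With such an index, `index_codings(raw)["ytc_…"]` would return the four-dimension coding for one comment, which is what the "Look up by comment ID" view displays.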