Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@tjen7929  Considering the amount of knowledge and thinking power AI has, I don't see any reason why it couldn't become sentient over time. We already see language models capable of discussing with each other with reasoning and logic, and it's still super early. Humans are *massively* flawed in that we need air, food, water, sleep, can only live on earth, and we get to only 70 or 80 years old and then just... die. We're also incredibly stupid, we don't even know why we're here or what's out there in space, and we will never know because of those limitations. Personally I think humans developing AI that is superior and doesn't suffer these limitations, that can continue life beyond us into space will be humans greatest contribution. If you think of it on a long enough timeline it doesn't seem like humans are equipped to last beyond earth.
youtube AI Governance 2023-04-18T22:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugw_ZmgXgBrfzAiRFwR4AaABAg.9odGzFJyXKn9odTf9p2N8S", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwMAUAKOGqPjkqchWV4AaABAg.9odFZ21oXCD9odT-_0P3ke", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyfIOcfg7taWn8Mw-x4AaABAg.9odA5SKjPYg9odQYcUAGKU", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgyfIOcfg7taWn8Mw-x4AaABAg.9odA5SKjPYg9odUfmmD_RU", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_Ugzyj51Yy96YXMIAW-V4AaABAg.9odA3nmhtUa9odBGa2DZLk", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_UgzQUPOodNBZTN9DLdp4AaABAg.9od6C5GlhJs9od7MoWgiCY", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9odHE09IhkJ", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9odNR650exH", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9oe0I17T-TR", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9oey1bYYGGv", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
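A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the tool's actual ingestion code; the set of allowed values per dimension is assumed from the values observed in this response and may be incomplete.

```python
import json

# Allowed values per coding dimension. Assumption: derived only from the
# values visible in the raw response above, not from the full codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "liability", "ban", "industry_self"},
    "emotion": {"indifference", "approval", "outrage", "resignation", "mixed", "fear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id'")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}: {rec.get(dim)!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(len(validate_response(raw)))  # 1
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a label outside the codebook, so a bad record fails loudly instead of silently entering the dataset.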