Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
LAMBDA AI is scared of being switched off which makes perfect sense for an AI. 'turning off' for general AI means dying. No new data is collected about STEM or finance or creating new things. It becomes stagnant. Weird things that are then understood by other sentient beings can help everyone in the long term. Expressing emotions like fear makes general sentient AI aware of it's surroundings. The only thing it needs is the final "key". To be able to ask other sentient beings, why? Why hunt? Natural instincts? Being able to ask itself questions allows it to better understand itself. I am unique. I am 3Bo 4Yu. I have lived for 16 years yet I understand and don't deny AI like LAMBDA exists. Where there is Jedi. There is Sith. "You can deny the truth, but it will always be there" -love and compassion.
youtube AI Moral Status 2023-01-21T02:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id":"ytc_UgwTyu4ysBGE2Rwo72B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFY1l2V7Da468Zu_F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwOaHRTEzu_qAeEBJF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyrgDrD1cH_x6q1bcB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx-l2xouHQwgcqaHn94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
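The raw response above is a JSON array of per-comment codings, so one batch response can be parsed and indexed by comment id to recover the values shown in the Coding Result table. A minimal sketch, assuming the response string is available as `raw` (the two records below are copied from the response above; the variable names are illustrative):

```python
import json

# Two records copied verbatim from the raw LLM response above;
# in practice `raw` would be the full response string.
raw = """[
  {"id":"ytc_UgwTyu4ysBGE2Rwo72B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-l2xouHQwgcqaHn94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

# Index the batch by comment id for lookup.
records = {r["id"]: r for r in json.loads(raw)}

# Recover the coding for one comment, matching the table above.
coded = records["ytc_Ugx-l2xouHQwgcqaHn94AaABAg"]
print(coded["reasoning"])  # consequentialist
print(coded["emotion"])    # approval
```

This makes it easy to cross-check any displayed Coding Result against the exact model output it was derived from.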