Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I agree with ChatGPT and DeepSeek, though. The answer might not be politically correct, but the other AIs sound like they are eager to harm millions if there is a pretext like "...but it saved 5 lives". Kind of like developing a vaccine that evidently harms millions, and also costs millions... but they can portray 5 people who believe their survival was due to receiving the vaccine in time. Even if I was among the five myself, my life is limited... One day I need to go anyway, and my opinion is that my soul will be reborn; I might be wrong, but even if not, I still expect this to happen... So I happen not to care about when to cross over into the spiritual world, and I also happen not to care how long my spirit will dwell there before I find a suitable host to continue a new life with. So I would not want to owe society a life-saving debt that comes at a huge sacrifice for everyone... If I don't want to be saved, please do not save me; think of the new-born baby I will become, who wants a life in comfort instead of hardship, and ask the other 4 people too whether they want to be saved at the expense of the whole society.
youtube 2026-01-28T14:3… ♥ 6
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugwqzt58JBR_kBo70Gt4AaABAg.APE7zXqYapQAPE9I-grZ5i", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugwqzt58JBR_kBo70Gt4AaABAg.APE7zXqYapQAPE9VcPb8O2", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyFQMT160D5edaoxMJ4AaABAg.APE7v8XZFp3APFl91trY-K", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgxSZ-ffikxJiZ-ZwJp4AaABAg.APE7b4XWjzAAPEFXKaxfGa", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxSZ-ffikxJiZ-ZwJp4AaABAg.APE7b4XWjzAAPEM7vogVUr", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgzoUhb0Xw9XdxsqUC94AaABAg.ATETgjK4UmxATETwCvFAd3", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzejS-YzjJRoAIhxOJ4AaABAg.AT2myA4ZSUFAT2oJ8AHNlo", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxwOkjwjfxXnYL4gbF4AaABAg.ASKD17oWpCpASXD4N5v3jq", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgwfByng9kB1LH20H_t4AaABAg.AUfaRf4Cf80AVAeiOBpT7u", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugx0_X2gTSliv58vTTl4AaABAg.AU9Ohzl31dIAUcwd5XlemG", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
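The raw response is a JSON array of per-comment codings keyed by `id`. A minimal sketch of how one comment's coding could be pulled out of such a payload (the helper name `coding_for` is illustrative, not part of the tool; the excerpt below copies two entries verbatim from the response above):

```python
import json

# Excerpt of the raw LLM response (two of the ten entries shown above).
raw = """
[
  {"id": "ytr_UgxwOkjwjfxXnYL4gbF4AaABAg.ASKD17oWpCpASXD4N5v3jq",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugwqzt58JBR_kBo70Gt4AaABAg.APE7zXqYapQAPE9I-grZ5i",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]
"""

def coding_for(comment_id: str, payload: str):
    """Return the coding dict for one comment id, or None if it is absent."""
    return next((row for row in json.loads(payload) if row["id"] == comment_id), None)

row = coding_for("ytr_UgxwOkjwjfxXnYL4gbF4AaABAg.ASKD17oWpCpASXD4N5v3jq", raw)
print(row["responsibility"], row["policy"])  # company liability
```

Looking up by `id` like this also makes it easy to spot mismatches between the table above and the batch response, since both carry the same four dimensions per comment.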