Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm saying this as a programmer, this guy is full of shit. Anyone with even basic-moderate levels of programming experience will tell you this guy has zero idea what he's talking about. He's just doing this for attention. We are nowhere near creating a true, self aware artificial intelligence. I'll try to go into more detail if you want me to, but the simple fact is the way our computers work they simply cannot accurately represent a human brain, let alone reach a level of self awareness. It's like asking "how do we know whether a normal car can run on water and not need gas". Any mechanic, or anyone who knows how an engine works at all, could immediately tell you it is simply foolish to even entertain the car that normal, gas powered vehicle can run on just water. Similarly, any programer can tell you that to think modern computers could actually run a selfaware AI is just a foolish statement to make and shows a lack of understanding of how computers (and our brains) work.
youtube AI Moral Status 2022-07-15T03:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_Ugws3nxHpLRiMeKRC-d4AaABAg.9dUeJQScOca9dVjy9K2vCU","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugxs2kHxvfDrSw6bvgd4AaABAg.9dTrHwbORnp9dUb0oJ16Ki","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxs2kHxvfDrSw6bvgd4AaABAg.9dTrHwbORnp9dUewn_dDV2","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugxs2kHxvfDrSw6bvgd4AaABAg.9dTrHwbORnp9dUkhq4oBJj","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgznKggAYaYBpWrknTp4AaABAg.9dTl_CEmI0P9dUcBjorEe6","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwEpDDJC0VeC8NZ6fN4AaABAg.9dTeuzpvSU89dW6FKYgx8U","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxObHT_dNbniwnJDOl4AaABAg.9dT1BV-Ytc69e7bFiuNCQH","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugx6WObmpFjle_o2jHt4AaABAg.9dT-1T5ann59dTEYgB3e9a","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwJQjJ3UCQ9-_NgG0d4AaABAg.9dSiMC0Hj0T9dUclW-v8AN","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyYQ-mtLbwEAK87XTl4AaABAg.9dSS38eu94l9dYP4poQzi8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
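To inspect the exact coding for one comment, the raw response can be parsed as a JSON array and filtered by comment id. The sketch below is a minimal example, assuming the response is valid JSON as shown above; the helper name `code_for` is illustrative, not part of the coding pipeline, and the array here is trimmed to the first two records from the response.

```python
import json

# Raw LLM response trimmed to two records for illustration;
# the real payload is an array with one object per coded comment.
raw = """[
  {"id":"ytr_Ugws3nxHpLRiMeKRC-d4AaABAg.9dUeJQScOca9dVjy9K2vCU","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugxs2kHxvfDrSw6bvgd4AaABAg.9dTrHwbORnp9dUb0oJ16Ki","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

def code_for(records, comment_id):
    """Return the coding dict for the given comment id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw)
row = code_for(records, "ytr_Ugxs2kHxvfDrSw6bvgd4AaABAg.9dTrHwbORnp9dUb0oJ16Ki")
print(row["reasoning"])  # consequentialist
```

Looking up by id rather than by list position keeps the inspection stable even if the model returns the records in a different order than the comments were submitted.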