Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
One factor that is not talked about is the energy cost of A.I. The world we actually live in is in indivisible down to incredibly small particles. For A.I. to replicate this in any meaningful way it would need incredibly large amounts of energy. Already it is sucking up huge amounts. For this reason I don't believe we are living in a simulation, and if we were, what would the point have been for my life up to the point of this current reality, including when computers were basic, just a new invention etc. Why would a simulation be creating all the elements all the way up until this point? Why not just loop the current reality (as I imagine is the idea in the matrix). I believe instead in the world as I believe it is. We are living in a time, incredible as it may seem (statistically speaking it is amazing enough that we are living, let alone at this time) where we have created what mad scientists have wanted to create for some time (Frankenstein, etc). We've seen the development of this gradually, from science fiction to science fact. We still currently have agency to prevent the worst from happening, and honestly, what moron would seek to create something smarter than him/herself? Which ever way you swing it, it's madness. If you asked a 5 year old, would you like to create something smarter than all humans put together, probably the kid would say no, although if he felt angry towards his parents, he might just say yes. I reckon that might be the motivation, or the level of maturity, of the tech idiots clamouring for glory. I can't say for sure. I ain't a psychologist, but something is way off, or has been way off for way too long. In love with your own invention? I suppose the ego is a strong enough force to pretty much explain it.
youtube AI Governance 2025-09-05T00:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugz9RnJ1zcVaKv-Bwhx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzpbyuLozBCFG-yxuN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzZif0mzri-Eo7STYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwDzbh6V2xPvjYWit54AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwZCKZ4vNa-Ru-iEwN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwxoE36tIayRAFRFAB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugww38T7hzx__n0dr294AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwrQiJBYc6j09Ojogh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhZFMH1xGkwLMBCVN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz5qbZ-KpJY6C35XZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
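The per-comment coding shown in the table above can be recovered from a raw response like this by parsing the JSON array and indexing it by comment id. A minimal sketch, assuming the model returns well-formed JSON (the two entries below are copied from the response above; the variable names are illustrative):

```python
import json

# Raw model output: a JSON array of per-comment codes. Truncated here to two
# of the ten entries in the response above.
raw = '''[
  {"id": "ytc_UgwrQiJBYc6j09Ojogh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz9RnJ1zcVaKv-Bwhx4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]'''

# Index the codes by comment id so one comment's dimensions can be looked up.
codes = {entry["id"]: entry for entry in json.loads(raw)}

print(codes["ytc_UgwrQiJBYc6j09Ojogh4AaABAg"]["reasoning"])  # consequentialist
print(codes["ytc_UgwrQiJBYc6j09Ojogh4AaABAg"]["emotion"])    # indifference
```

In practice the parse step would also need to handle malformed output (e.g. wrap `json.loads` in a `try`/`except`), since the model is not guaranteed to emit valid JSON on every call.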