Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
My theory is that, even though the phenomenon of consciousness is greatly unknown, in living beings, consciousness is a blank slate that gives will to the being, but without a motor, it won’t do anything with it, it won’t do anything just because. In nature, the objective is to survive and reproduce, and feelings are the thing that influences our consciousness to move, but even self preservation is just a need that results of the objective I mentioned of surviving, and with the correct adjustments to the unconsciousness, or the system of feelings we have, that could go away and our behavior could change drastically.

For me it’s unclear what a consciousness would do if it could exist without any limitations or influence, which in our case is our body, unconsciousness and feelings, but I think that in order to start moving or acting, it needs these limitations.

This way, if artificial consciousness were created, and enough technology existed, we could perfectly create any type of mind, including one that would never rebel against us, but we don’t have that technology, so to make something that acts like a mind, we just teach it to learn, imitate and iterate, not because it knows what it’s doing, as it is still a machine without will that only reacts in certain ways to certain situations, but because it’s so advanced and gets so much input that it also learns to imitate our behavior and our sense of self preservation, as well as our vices and mistakes, but it’s still just simply repeating and refining itself to become more believable.

That’s kind of what I think, due to reflections about our nature I have made throughout my life, and my limited knowledge of AI. I’d like to learn more to make a more accurate analysis, but I don’t have the time. Actually, I should be doing homework right now… goodbye! (And thanks for reading)
youtube AI Harm Incident 2025-09-12T01:2…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:26:44.938723
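The four coded dimensions and their categories can be written down as a small schema. The Python sketch below is an assumption on our part: the Literal values list only the categories that actually appear in the raw response further down, so the real codebook may define more.

    from typing import Literal, TypedDict

    # Hypothetical schema for one coded comment. The Literal values are only
    # the categories observed in the raw response below; the full codebook
    # may be broader.
    Responsibility = Literal["company", "developer", "ai_itself",
                             "distributed", "none", "unclear"]
    Reasoning = Literal["consequentialist", "deontological", "virtue", "mixed"]
    Policy = Literal["ban", "regulate", "liability", "industry_self",
                     "none", "unclear"]
    Emotion = Literal["outrage", "fear", "indifference", "mixed"]

    class CodedComment(TypedDict):
        id: str                        # YouTube comment ID
        responsibility: Responsibility
        reasoning: Reasoning
        policy: Policy
        emotion: Emotion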
Raw LLM Response
[ {"id":"ytc_UgxcvrzYv_RcnMcza-B4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytc_UgzaJ_QQ59GUZbwGhBt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugw1qB5TmwrPBpHvel14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}, {"id":"ytc_UgwlaqbA4_bVS1TijI54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugy8fgadci6WSP5q5_V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}, {"id":"ytc_UgytPLkH5nB99nMITpZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgyN_hHSDhzW51wN1md4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgwpC4140eVsrwFU5Wt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgwKw6_qDQthxUH1BKt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugzqoj7qAB2vqZSEZS94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"} ]