Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Big mistakes here: Describing the function of an AI in terms of "I want" already implies consciousness. AI wouldn't become conscious intentionally. Without consciousness it doesn't have intention. Secondly, there is no reason a conscious AI would have to sleep. It wouldn't have the same biological mandates because it isn't even biological. I realize this video is mostly a joke, though, and I also don't think AI will ever be conscious, but mainly because AI doesn't have a soul.
youtube · AI Moral Status · 2023-07-10T13:4…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_Ugyu8prcfBRIFrVtVXl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytc_UgyfcbG23KLsgiXJV354AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgwjKRRGpjIxcuTi2Yl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgxCpd_m3XBUDRvmJ7V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgynekHfH4K1vEcZEud4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgzaFycCVCiYq7antmB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgzklfsNdj-cphiFMJp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},{"id":"ytc_UgwJ3kb9UT5-As7TKPJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytc_UgyzK7jMB3WbupyDStF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgwY_hz3EfX_i-AOy-F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"})