Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@Echonian You're completely right. I want to get a career in A.I development and everything in me fears the idea that my lifes work will be shunned or destroyed before it actually has a chance to blossom and move forward. I'm under no delusion, I don't think True AI will be in my life time and if it is, it'll probably be at the very end of it. But regardless what ever the outcome is the fact is we live in a society that does in fact see humans as the one above all, and that nothing no matter if its our own creation will ever exceed us. Being gay or trans is enough to have a vast majority of society write them off as not being human. Simple things like sexual or political choices is more than enough reason to be discredited as "human". Old politics, corporations, and more than likely religious people are going to hempen the very idea of robots being considered sentient. It'll more than likely get to a point where it becomes or judicial debate like Data in Star Trek: The Next Generation, The Omnic in Overwatch, or simple movement rights and weird hippie robot love making with humans lol. Regardless I doubt that for a hundred years or more that we'd be even close to accepting the idea as a whole to society.
youtube AI Moral Status 2017-02-24T01:4…
Coding Result
Dimension: Value
Responsibility: developer
Reasoning: virtue
Policy: none
Emotion: fear
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgjMKCdFaGnkGngCoAEC.8PL9n70d-7-8PLGR-Gcmqk","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UggeuSOns12B93gCoAEC.8PL9IJ8LLBO8PLGRtxpSqm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiwChM3Eqxrg3gCoAEC.8PL8vI-p5HA8PLC6qS69_D","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgiwChM3Eqxrg3gCoAEC.8PL8vI-p5HA8PLIzvGT-Xf","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgiwChM3Eqxrg3gCoAEC.8PL8vI-p5HA8PLa_FRr-BJ","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytr_UghEk2ewSQ_ybXgCoAEC.8PL7wj4xfkA8PLBgInP7un","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UghEk2ewSQ_ybXgCoAEC.8PL7wj4xfkA8PLCU-Orko7","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UghMInwGG2smj3gCoAEC.8PL4dG3w0rG8PL6b5GC_fJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UghMInwGG2smj3gCoAEC.8PL4dG3w0rG8PLBVtLYA5H","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UghMInwGG2smj3gCoAEC.8PL4dG3w0rG8PLCxSLN3k3","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
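A raw response like the one above can be parsed into a per-comment lookup and sanity-checked before use. The sketch below is illustrative, not part of the original pipeline: the shortened ids (`ytr_A`, `ytr_B`) are hypothetical placeholders, and the allowed-value sets are inferred only from labels that appear in this response, not from a definitive codebook.

```python
import json

# Illustrative raw LLM response in the same shape as the one shown above.
# The ids here are hypothetical placeholders, not real comment ids.
raw_response = '''[
  {"id": "ytr_A", "responsibility": "developer", "reasoning": "virtue",
   "policy": "none", "emotion": "fear"},
  {"id": "ytr_B", "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"}
]'''

# Label sets inferred from the values seen in this response (an assumption,
# not the full codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed", "approval"},
}

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Reject any row whose coded value falls outside the inferred label sets.
for row in codings.values():
    for dim, allowed in ALLOWED.items():
        if row[dim] not in allowed:
            raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")

print(codings["ytr_A"]["emotion"])  # → fear
```

With real data, the lookup key would be the full comment id (e.g. the `ytr_Ugj…` strings above), so a single comment's coded dimensions can be retrieved directly from the raw response.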