Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What the fuck is this cancer. Conciseness(think i spelled it wrong) is a series of chemical reactions and electric signals that take place in the brain. We do know what it is. We know exactly what it is but we deny that fact because "religion" and "souls" and shit. Also no mater how you try, unless you somehow give robots nerves that are sensitive to the environment ai could not EVER feel pain. And if you did it wouldn't be AI anymore because it would require a brain. Please tell that SOMEBODY besides me thought of this. Also robots are programs meaning they could be copied. A robot it forever incapable of feeling pain for something like the loss of another robot because it's program could just be copied and put into another body. This is why putting man into a machine us impossible. Because "conciseness " is a mix of chemicals in the brain. It cannot be copied unless you made a robot to produce the same exact chemicals and that would need to be organic eg: not a computer and capable of actually dying. Somebody kill me I just watched the shiddyest video about ai on YouTube.
Source: youtube | Video: AI Moral Status | Posted: 2018-10-17T05:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugx2df_rpyC7-9zbkWZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxjLpuKrucXwLR_UXh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxPiq_VfTE_dcaTjsN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzf73Srn0Gs-21yzgp4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxxhV09OVma3ss33cB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzsCaN9gu2o8mRX0ZR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwaLBLZfSlIqXmC3LB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxMG1YSOIiCRNYYBXN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzSoECK7c4O7aX38Eh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzNqJLu-kc6KxSpfVN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
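A minimal sketch of how a raw batch response like the one above might be parsed and checked before the per-comment codings are stored. The allowed value sets below are inferred only from the labels visible in this response and the coding table; the real codebook may permit other values, and the `parse_codings` helper is hypothetical, not part of the tool.

```python
import json

# Allowed labels per dimension, inferred from the response above (assumption:
# the real codebook may include additional values).
SCHEMA = {
    "responsibility": {"none", "developer", "user"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}


def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only schema-conforming records.

    Each record must carry an "id" plus a valid label for every dimension.
    Records with unknown labels or missing fields are dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" in rec and all(
            rec.get(dim) in allowed for dim, allowed in SCHEMA.items()
        ):
            valid.append(rec)
    return valid


example = (
    '[{"id": "ytc_example", "responsibility": "none",'
    ' "reasoning": "deontological", "policy": "none", "emotion": "outrage"}]'
)
print(parse_codings(example))
```

Validating before storage matters here because a single malformed record in the batch (a hallucinated label, a missing dimension) would otherwise silently corrupt the coded dataset.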