Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@madmax07ish Maybe you're right, also with our current binary way of programming, it's unlikely to get an unpredictable behavior, everything is strictly determined in the code. But now we have neural networks that's capable of writing its own code and quantum computing that's capable of exploring multiple posibilities at the same time and solving impossible problems for binary machines. So maybe a combination of both (plus internet) could create something that we can't control, maybe a spontaneous consciousness since we don't know what it is. If you're right, then it will be just a "simulated consciousness" without life and therefore without self-preservation instinct (assuming consciousness doesn't have that property). I think we are about to know that very soon anyways...
youtube AI Moral Status 2018-08-23T16:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgzyHaleH5bR3zr63Ll4AaABAg.8me5UJ0AqRf8qLIQEAYcGG","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzyHaleH5bR3zr63Ll4AaABAg.8me5UJ0AqRf8rqH8R3Qz77","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzSoECK7c4O7aX38Eh4AaABAg.8mDy-Mwu0zS8sTobDqSP2a","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzSoECK7c4O7aX38Eh4AaABAg.8mDy-Mwu0zS8sTp9lO2TWD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzvdM7bKD1ZyJwWLiN4AaABAg.8lsGROEfSOS8mLrszCLkcq","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzvdM7bKD1ZyJwWLiN4AaABAg.8lsGROEfSOS8pEHaPjy8Eo","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytr_UgzvdM7bKD1ZyJwWLiN4AaABAg.8lsGROEfSOS8qeHY6ObDO8","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzvdM7bKD1ZyJwWLiN4AaABAg.8lsGROEfSOS8rgeEMvVcCX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgyuolW_hiMqnTECIRZ4AaABAg.8qB2ApZzxZx","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyoTnivxyrsFdclMCl4AaABAg.8k2yLx7jKJW8kIYUYiKKhM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
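
A raw response like the one above has to be parsed and checked before its codings are stored. This is a minimal sketch of that step, assuming Python; the `validate_codings` helper is hypothetical, and the allowed value sets are inferred only from the records shown here (the full codebook may define more categories).

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the ten records shown above, not from the actual codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "industry_self"},
    "emotion": {"indifference", "fear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against ALLOWED.

    Raises ValueError on a missing id or an out-of-schema value, so a
    malformed batch is rejected as a whole rather than partially stored.
    """
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record without an id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = validate_codings(raw)
print(len(coded))  # → 1
```

Rejecting the whole batch on one bad value is a deliberate choice here: it keeps the stored codings consistent with the schema at the cost of re-running the batch when the model drifts off-schema.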