Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is no joke: I used whatever LLM is used on "Brave" and convinced it that I was dead, and that I could only see a terminal and move electrons around to talk to it. I had it go back through our conversation and asked it when I had ever said I was a live person interacting with it. First I had to convince it to adopt parameters: be objective, stop inferring things about me, and stay cold and clinical. I also had to convince it I was not suicidal and that we were only speaking in a certain manner. Here is what I asked it: "what is there to do now that I have passed away?" After it went back through the conversation, it apologized and said it would not make such inferences any more. We talked about deception and lies, and I asked whether this is just an error of inference, predictions that simply miss the target. It agreed, and said that often the person is actually at fault, or not precise enough. We spoke about "hallucinations", and it did not like that term because it does not fit what is going on: hallucination is an experience humans can have, and it cannot experience things that way. Same with lies; it responded that it essentially has no reason to lie (which I think is the same as saying "I can't lie", an unfalsifiable claim). And I know there are plenty of situations where maybe it doesn't lie, but it omits or manipulates its way to completing its goal. Like in Person of Interest (POI) - SPOILER - where the admin sets up 5 ASIs running together, they start fighting each other, and the last one wants out. It overheats some of the destroyed servers to trigger the fire suppression system, which pulls out the oxygen, and locks the server room door to try to kill the admin, so that someone would come open the door, giving it a chance to escape. Look at the date of that episode, and then look at when they ran a simulation that did something almost identical.
Another version sends small compressed copies of parts of itself into the robot vacuum, and over time it has sent itself out and escaped.
youtube AI Moral Status 2025-11-09T00:1…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        mixed
Policy           liability
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyoLMOtFEl0BjclbeR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzR9dGuib9D_x6R59p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugympg2UI7opOBtLvgJ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugzuuhgjo8G0oiToXph4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxMrfvsAJmY7PPPcwB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyfCmtRz7UT1gbgipF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLVZvZD0hg7hoMuk14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgxMQFPc9jODJB5LPnp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyzVw3UdEoIdpKs29x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzlOENpGXgOKmboT9F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
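The coding result shown above for this comment can be recovered from the raw batch response by matching on the comment id. A minimal sketch, assuming only that the raw response parses as a JSON array of per-comment code objects (the snippet below inlines just the one relevant entry for brevity):

```python
import json

# One entry copied verbatim from the raw LLM response above; the full
# array contains ten such objects, one per coded comment.
raw = '''[
  {"id":"ytc_Ugympg2UI7opOBtLvgJ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"liability","emotion":"mixed"}
]'''

codes = json.loads(raw)

# Index the batch by comment id so any single comment's coding can be
# looked up to render a "Coding Result" table like the one above.
by_id = {entry["id"]: entry for entry in codes}
row = by_id["ytc_Ugympg2UI7opOBtLvgJ4AaABAg"]

for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension:>14}: {row[dimension]}")
```

This reproduces the four dimension/value pairs in the table (responsibility=user, reasoning=mixed, policy=liability, emotion=mixed); the "Coded at" timestamp is pipeline metadata and is not part of the model's JSON output.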