Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As a bible believing christian I already have an end time prophecy that makes much more sense than a hollywood movie. This present corrupt world will not end like that. Humanity will not be exterminated by autonomous AI, and it probably won't be a major factor in any large scale war either. But in the mean time, I don't see any prophetical objection to the possibility that armed autonomous drones end up being deployed in some areas, and causing a good deal of grief. The idea just turns my stomach with disgust and anger towards the idiots that are working on it. As for why I think that it won't achieve Terminator levels? Well, the whole conflict involves armies of men dying and killing, and at some point it's mentioned that the world would be in darkness and that they would fight on horseback and so on. This would seem to suggest that eventually the world loses the ability to use technology, perhaps through one of these dreaded solar storms knocking out all electronics, or maybe something more supernatural. Whatever the case, deploying autonomous armed drones in the world is just an awful idea no matter how you look at it, and I feel little more than contempt for the profession of those involved in that unholy endeavour.
youtube 2019-06-19T01:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugz1b6hJ0hdS_gdTq4F4AaABAg.AHFbF0naRvIAVpguBoO_Dn", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwOAORBF8dUCx4ef414AaABAg.9MP28cxP4B_9MTd2tusi9c", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyV6zwPqQ8y6rb3qvt4AaABAg.9EUhZHpZcN-9F-9tsmT00V", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyV6zwPqQ8y6rb3qvt4AaABAg.9EUhZHpZcN-9W5uQrH75SR", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugxsmq36x96mhb7CJpp4AaABAg.943UPW9uBpU9EVZwUduCn5", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugz6tMO1HuQijvri-NV4AaABAg.8wJP4qeVUkG8wOw1YcEDy_", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzQoiHSIs4eldEWBpp4AaABAg.8vsvTxR-Fdx8w8RgO2bD0S", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyyYfDnFH8lVL1HuFx4AaABAg.8vbnSRjLPqU8wLN3ocb7W9", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwJD2lznMWMukhxokd4AaABAg.8vOPkOUUIJ88vt5Bb61tNV", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwJD2lznMWMukhxokd4AaABAg.8vOPkOUUIJ89A5j7pp8jFs", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
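The raw response is a JSON array in which each object carries one comment id plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response could be parsed and tallied, using only the standard library; the two sample entries are copied from the response above, while the variable names (`codes_by_id`, `emotion_counts`) are illustrative, not part of the tool:

```python
import json
from collections import Counter

# Two entries copied from the raw LLM response above (shortened sample).
raw_response = """[
  {"id": "ytr_Ugz1b6hJ0hdS_gdTq4F4AaABAg.AHFbF0naRvIAVpguBoO_Dn",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgyV6zwPqQ8y6rb3qvt4AaABAg.9EUhZHpZcN-9W5uQrH75SR",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"}
]"""

# Parse the array into a list of dicts, one per coded comment.
records = json.loads(raw_response)

# Index the codes by comment id, as the per-comment view above does.
codes_by_id = {rec["id"]: rec for rec in records}

# Tally one dimension (emotion) across the batch.
emotion_counts = Counter(rec["emotion"] for rec in records)
print(emotion_counts)
```

This keeps the raw string intact for auditing while exposing the coded dimensions for lookup and aggregation.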