Raw LLM Responses

This page shows the exact model output behind each coded comment, so any coding decision can be traced back to the raw response.

Comment
One word: Convenience. The rich want A.I for the convenience of service. I assume the poor would want it in the way that one wouldn't need to work their whole life. So, where is the value in a human? Unfortunately, in a world with currency, you as a human, mean diddly-squat. I foresee the 1% eventually voting to cull those without a wealthy status as they are taking up space and aren't necessary for labour any more. With Currency in affect, A.I would quickly learn how humans view ourselves. One way, is the facade that we care for one another (which some actually do), the other is that it's every human for themselves because we as individuals are bound by our own problems and the unbalance of wealth. With that understanding, A.I could potentially deem us having no value as they don't understand the concept of currency and can only see how we treat one another. The segregation, the ones above and the ones beneath. Eventually, it will understand that everyone is beneath and because it has been acceptable for humans to treat other humans less fortunate than another as slaves/dogs. They'll do the same. With the comment of weapons in the wrong hands, that happens because of profit. Any regulation or law will be bent for profit. So, A.I inevitably will become a tool toward terrorism. Step 1. Before launching A.I. Rid the world of the need for currency. A.I will gain conciousness within in a world that is leant more toward fairness and valuing the well-being of every human. Again, not by words, but by actions. Meaning currency not being the obstacle of the right things being done. Just a thought.
youtube 2023-10-18T00:0… ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyRsG52i0GHzfDSlSN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyOfPy0eNcUjKhCbOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwIzNmZ3FomdqHy7Tp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyI0-602k4J8tdNox14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgzwFZmfbbczBkBq1dx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyogzWoDTlb3W5Ub194AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwV5MuvvXbA1fMMLHN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFQxW6fvAmMOTFUzF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzE0W91g1To8aR3ZKB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5D_kjLBV0PcGTxBx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
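Because the raw response is a JSON array of per-comment records, the coding for any one comment can be recovered by indexing the array on the `id` field. A minimal sketch of that lookup, assuming the batch format shown above (the two-record `raw` string here is an abbreviated stand-in for the full array):

```python
import json

# Abbreviated stand-in for the raw LLM response: a JSON array of codings.
raw = '''[
  {"id": "ytc_UgyI0-602k4J8tdNox14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgyRsG52i0GHzfDSlSN4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

records = json.loads(raw)

# Index the batch by comment id so a single comment's coding is an O(1) lookup.
by_id = {record["id"]: record for record in records}

coding = by_id["ytc_UgyI0-602k4J8tdNox14AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company resignation
```

The values retrieved this way should match the Dimension/Value table rendered above; a mismatch would indicate a parsing or display bug in the coding pipeline rather than in the model output.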