Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
fanman421 The advancement of communications systems is precisely due to the advanced instrumentation that we use to develop, manufacture and service them. So by definition, we are always a step ahead. The systems can't communicate faster than our designed protocols dictate to them, and those protocols cannot send data faster than the hardware that encodes them, the hardware cannot send faster than the logic circuitry and the firmware inside them, the processing speed of the logic and the firmware is throttled by the output driver ICs; etc., etc. You cannot fundamentally alter this functioning at all the levels. Yes, you can hijack the actually transmitted data under some circumstances, but this is what hackers have been doing all along. I don't believe that there ever will be any tipping point: We'll be always able to beef up our monitoring and response to match new threats, both internal and external. There will be new threats that will bring more challenges and more solutions. But even if all that becomes too much, we can still turn the computers off and go back to our good old fax machines. Life would surely slow down, but maybe it would not be such a bad thing. Frankly, I am more worried about the current tendency of the society to overblown reactions regarding global issues such as climate change, WEF, China, Russia, etc... - and now AI. Such reactions can often be more harmful than those threats themselves. In short, I am not too worried about AI; I do keep my mind open, I do watch what's going on, but somehow I still manage to sleep well at night. 🙂
youtube AI Moral Status 2023-09-03T07:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgwTO4R3Mq1oGMl7w0l4AaABAg.9uG8sHrBoa3A70iVuXuu4G","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxz2p8vEn6itGV4Msl4AaABAg.9uFV0A0ZXm89uV4WdfhcQq","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugxz2p8vEn6itGV4Msl4AaABAg.9uFV0A0ZXm8ABS1vt30f9L","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxxwz9mDQEzkf1CNvZ4AaABAg.9uDMK_aihCg9uG1wx9ZGKy","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugxxwz9mDQEzkf1CNvZ4AaABAg.9uDMK_aihCg9uG92iyvNLR","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwG7bo8vEYt9aRuoi14AaABAg.9u9ojjKFZ5i9uAryBjn0G5","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwG7bo8vEYt9aRuoi14AaABAg.9u9ojjKFZ5i9uBMvbiNqCR","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwG7bo8vEYt9aRuoi14AaABAg.9u9ojjKFZ5i9uBfRYjokDL","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgziBYQGtd_x1rPAAwB4AaABAg.9u6V-N2SVfn9vrHwIQhuiu","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugxw9_pQcmJVCv56bs14AaABAg.9u6GvVYmHOpA10tXnj_xph","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
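For reference, a minimal sketch of how a raw response like the one above could be parsed and checked before the per-dimension codes are stored. This is illustrative only: the `ytr_abc` id, the `parse_codes` helper, and the sample payload are made up for the example and are not part of the coding pipeline; the key set is simply the one visible in the JSON shown here.

```python
import json

# Hypothetical sample in the same shape as the raw model output above:
# a JSON array of per-comment records with an "id" and four coding dimensions.
raw = '[{"id":"ytr_abc","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}]'

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw_json: str) -> dict:
    """Parse a raw LLM response and verify every record carries the expected keys."""
    records = json.loads(raw_json)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing keys: {sorted(missing)}")
    # Index records by comment id for lookup when rendering a page like this one.
    return {rec["id"]: rec for rec in records}

codes = parse_codes(raw)
print(codes["ytr_abc"]["reasoning"])  # deontological
```

Validating the key set up front makes a malformed model response fail loudly at parse time rather than surfacing later as a blank cell in the coding-result table.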