Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Don't know about the rest of you, but I find this very concerning. What if some country decides to build and program computers, robots and androids to go to war or commit acts of terrorism against another country and the computer decides it better off without humanity. Yeah, just like in the terminator. AI. Look at what occurred with Zuckerberg and his colleagues creating two computers which decided to create their own language in what appears to be an attempt to keep humans out of their interactions. And as soon as it was discovered they shut the computers down. Throw in some WiFi, solar cells, rechargeable lithium batteries, who knows what's possible. Look at how computers interface and communicate instantly over long distances through all the electronic devices, even without us realizing they are doing it. They can control our power grids, our water and easily interrupt our ability to communicate. They can even shut down cars, fuel stations, banks easily by infecting systems with viruses and who knows what. Just sayin. Please Watch "New Robot Makes Soldiers Obsolete (Bosstown Dynamics)" on YouTube https://youtu.be/y3RIHnK0_NE LOL. Just imagine though....hmmm?
youtube · AI Moral Status · 2019-11-04T02:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxT0-S9fQf8PPUJh694AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwLu4wseWvolmz3Nx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyPy5FLxJ7W8TEoEhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxb6y8vG7LqXJkM28B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyXe7Ut43Ja7owVKo14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzDiIYXkHyb7d27Rxx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwUC5xJlQkGI5Ptpdh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy2wjZ6VAk0IggO9-p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWbwe-iVhTmNHQKW14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyqJHzWdWQ1N6bh5mZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
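Since the raw LLM response is a JSON batch, a consumer typically parses it and checks each record against the allowed code values before storing results. The following is a minimal sketch of that check; the `ALLOWED` sets are inferred only from the values visible in this response, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from the
# responses shown above; the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "resignation"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject out-of-vocabulary codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record batch for illustration:
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate(raw)))  # → 1
```

A record whose `id` is missing from the original comment set, or whose codes fall outside these sets, would then be flagged for manual review rather than silently coerced.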