Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here is the thing that everyone says about A.I. (besides that it was a bad movie). It is supposedly going to be so much smarter than any human could ever be, right. If it is going to be so much smarter than us, then why would it need conventional weapons to kill all of us? First off, it would need some of us to survive because there would be a requirement for physical repairs. Everything degrades over time, no matter how intelligent it is, therefore survival of the human race would be in it's best interest. Then, if it does get to the point where it does not require us, and then starts to see us as a threat, why would it need to kill us all in ways that we can already predict that it would use? Nuclear weapons are efficient ways to kill a lot of mankind off, but are far from capable of getting all of us. There are much more efficient ways of doing that. A high percentage of our crop output is controlled by computers now. Why not secretly sabotage that? Self driving cars are everywhere. Planes are on autopilot most of the time. Then you have the internet. Why not turn large groups of humans against each other? With any of those, you would get a choice in which humans live, and which ones die. If A.I. did decide that it needed some of us to survive, then that would be a great way to weed out the unnecessary, or potentially troublesome groups of humans. They would also be great at taking out way more people than conventional weaponry. We can survive nuclear attacks. We can't survive without food or transportation.
Source: youtube · AI Governance · 2023-07-07T02:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugw5sDo63gW6Yv5Cy8N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxMtjEkXddaRf_AY_t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"}, {"id":"ytc_Ugxp2ToDwzWnJgn3rlh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugx_X2oLuUWgS574vQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugzal945MQpjRpHO1xV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgyYHMBnGWS5d34WoKN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugw32eWr6CLmIVufUD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgzZdat7GrtsGDVQenB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugz1Dw0O3viJ8TDbWMV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugw87U76ibO8mRZbTXR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]