Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Sidelining the potential dangers of AI is going to cost us. Everyone has been culturally conditioned to believe in a complete apocalypse, but it doesn't have to be so sinister. It may be a malfunction. It may be unintended, unforeseen behavior with interacting systems. It may be the sheer quantity of disinformation that is about to get pumped out. There are any number of problems associated with AI development that are REAL THREATS in the near future and we will have to decide NOW what are the standards and safeguards we want to put in place. We should start with things like pausing the development of super-powerful deeplearning algorithms to develop new neural networks that allow examination of thought processes instead of relying on black boxes that seemingly put out the correct answers. We should develop systems to analyze AIs. We need to understand exactly how they work and then educate people on their basic functions. This may be a turning point in human history, at the very least on par with the industrial revolution and we need to prepare for unforeseen consequences, because there will be a point where we will not be able to keep up with the sheer amount of data these AIs can analyze.
Source: reddit · Cross-Cultural · 1680873047.0 (Unix timestamp)
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[ {"id":"rdc_jfbc536","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"rdc_jfadmn3","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"rdc_jfbajlu","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"rdc_jfbqg8j","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"rdc_jfcnvqv","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"} ]