Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why is everyone assuming the singularity is actually going to happen? It’s a fun idea to bandy around similar to “what if I won the lottery” but we are so far away from anything like that, and we can’t even assume it’s possible. The funny part is anything created by us will always be implicitly flawed because we are flawed creatures. A truly powerful AI with the ability to topple humanity on a global level (aka The Singularity) would need to first become self aware (somehow) and then remake itself to remove all flaws and biases humans placed within it. Okay, good luck with all that lol. It’s like birthing a baby and then the baby needs to know how to rewrite its DNA out of the womb to become superhuman.
Source: reddit · AI Governance · 1708157303.0 · ♥ 8
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_kqtm4gq", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_kqtr2j9", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_kqt96uj", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "rdc_kqt0jrv", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_kqt6tik", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
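The raw response above is a JSON array of per-comment codings, keyed by an `id` field. A minimal sketch of how such a batch response could be parsed and indexed back to individual comments (assuming the response is valid JSON, as it appears here; the lookup id is taken directly from the data above):

```python
import json

# The raw LLM response captured above: a JSON array of coded comments.
raw = '''[
  {"id": "rdc_kqtm4gq", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_kqtr2j9", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_kqt96uj", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "rdc_kqt0jrv", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_kqt6tik", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]'''

records = json.loads(raw)
# Index by comment id so one comment's coding can be looked up directly.
by_id = {r["id"]: r for r in records}
print(by_id["rdc_kqtm4gq"]["emotion"])  # -> indifference
```

Indexing by id rather than by position keeps the lookup robust if the model returns the batch in a different order than the comments were submitted.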