Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The "Superintelligence" crowd is obsessed with the idea that an artificially intelligent machine would have the capacity to design a more intelligent artificial intelligence, and if the resources were available this series of increasingly intelligent machines would eventually lead to a superintelligent entity that would be beyond human comprehension. If we are told that a [hypothetically] superintelligent machine would be 1000 times as intelligent as any human, on what terms would we be making such a comparison? Does intelligence have a unit measure, so that this entity over here can be said to be twice as intelligent - or 1000 times as intelligent - as that entity over there?
reddit · AI Moral Status · 1487783914.0 · ♥ 5
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       unclear
Policy          unclear
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_de2k3ds", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_de2mgs2", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_de2q9jg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_de2tjrw", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_de2txwo", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"}
]
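The raw response is a JSON array with one object per coded comment, carrying an item id plus the four coding dimensions shown in the table. A minimal sketch of how such output might be parsed and validated (the function name `parse_codings` and the strict key check are assumptions, not part of the original pipeline):

```python
import json

# Expected keys per record, taken from the raw response above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_json: str) -> dict:
    """Parse the raw LLM response into {id: coding dict}, rejecting
    records that are missing any expected dimension."""
    records = json.loads(raw_json)
    codings = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        # Store the four dimensions keyed by the record id.
        codings[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS - {"id"}}
    return codings

raw = ('[{"id":"rdc_de2mgs2","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["rdc_de2mgs2"]["emotion"])  # fear
```

Keying by id makes it straightforward to join a record back to its source comment, as the table above does for `rdc_de2mgs2`.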