Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As I've mentioned earlier, yes morals are important, but not as important as the science behind it. We're talking about machines, the stuff that WE invent for OUR benefit. If you can't make the tool, why even bother thinking about how to treat it? Scientists are focussing on intelligence rather than emotions. Intelligence is the ability to draw conclusion from given data or having the skill to reason. That is needed first, because there are lots of problems among us and among our understanding of nature that needs to be resolved. We are bounded by the limitations of the human brain, we want something with better thinking capability to help us with our problems. Although inducing emotions into beings may also help us better understand about ourselves and our own consciousness, but its not urgent. Just think about what A.I. can bring, maybe a second industrial revolution? Awesome scientific breakthroughs? Make humans a type 3 civilisation? Extend our lifespan ten times beyond what is the average lifespan right now? Yet so many people are standing in the way of this development.
youtube · AI Moral Status · 2017-02-26T02:3…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytr_UgjLxrOYZeejZngCoAEC.8PMrCOh_WlP8PO9vFpJHID","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytr_UgiVpfQj89rjKngCoAEC.8PMnhumW13f8PN9y8DTfnA","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytr_Ughhc4tyNo5_AXgCoAEC.8PMkNeGZPj08PNBctcwzYf","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytr_Ugh-Tr1Sq_E_FHgCoAEC.8PMiIj2qZQP8PNkXaBst5k","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugh-Tr1Sq_E_FHgCoAEC.8PMiIj2qZQP8PNmsD1TZH2","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytr_UggJXPMrGWhAjHgCoAEC.8PMe5OU3nm-8PNgRoCLtIK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytr_UggJXPMrGWhAjHgCoAEC.8PMe5OU3nm-8PNiBFc9N9t","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytr_UggJXPMrGWhAjHgCoAEC.8PMe5OU3nm-8PQpvl1uO8l","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugjn2HuW9ataL3gCoAEC.8PMbyf9yAOo8PMufIkpGiW","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytr_UghFccuO56tzBngCoAEC.8PMKX4oTf0X8PMLdWg0f5o","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"} ]