Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Khashon Haselrig AIs that are in every way as capable as humans deserve the same ethical consideration, but we of course won't necessarily be making such AIs. Dogs have been bred to serve in certain roles. They don't feel unhappy when they fill the role they were engineered to fill. In the same way, there are no ethical concerns about engineering an AI for one sole task and then letting it do that task. A telemarketing AI that doesn't know the first thing about science or music composition wouldn't want to be a scientist or a composer, because it would be terrible at either. AI doesn't necessarily mean "human in a box". We may one day create "humans in boxes", but in the same way animals are getting more rights today, we should have a gradient of ethical concern leading up to and past human-level intelligence. We may one day have AI that is equal or superior to humans, and we may one day have laws that consider it WORSE to treat a post-human intelligence as badly as a human-level intelligence. For example, it may be legal to put a human in prison but illegal to similarly incarcerate a mind that is accustomed to debating quantum chromodynamics around the clock with theoreticians around the world. The point of both my last post and this post is that no one, neither human nor yet-existent AI, needs to suffer unfairly in order to objectively improve the quality of life on Earth and beyond.
youtube 2014-01-18T15:5… ♥ 2
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgiWsDgzf98Z9HgCoAEC.7-H0Z7-UsSH7-TgWFYDZxd", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytr_UgiWsDgzf98Z9HgCoAEC.7-H0Z7-UsSH7-Tj7TffuU-", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytr_UgiWsDgzf98Z9HgCoAEC.7-H0Z7-UsSH7-VdFKfObcO", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgiWsDgzf98Z9HgCoAEC.7-H0Z7-UsSH7-W-rVplSuz", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgiUp8D2QHj9c3gCoAEC.7-H0Z7-6ElA7-M5ud_9EQ6", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytr_UgiUp8D2QHj9c3gCoAEC.7-H0Z7-6ElA70_az8QDUkE", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "unclear",  "emotion": "fear"},
  {"id": "ytr_UgibeR0m4mjcFHgCoAEC.7-H0Z7-IZf_7-P4xa65C3y", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "approval"},
  {"id": "ytr_UgibeR0m4mjcFHgCoAEC.7-H0Z7-IZf_7-k-1TBpTAn", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytr_UgibeR0m4mjcFHgCoAEC.7-H0Z7-IZf_7-kCwaAisf5", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"},
  {"id": "ytr_UgghzE1DPJYeTXgCoAEC.7-H0Z7-Qj5u7-R1kVpd64e", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"}
]
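A minimal sketch of how a raw response like this can be checked against the rendered coding result: parse the JSON array, index it by comment id, and look up the dimensions for a given comment. The field names and the id come from the response above; the indexing step itself is just an illustrative assumption, not the tool's actual code.

```python
import json

# Excerpt of the raw model output: a JSON array of per-comment codes.
raw = '''[
  {"id": "ytr_UgiWsDgzf98Z9HgCoAEC.7-H0Z7-UsSH7-TgWFYDZxd",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]'''

# Index the rows by comment id for direct lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# The dimensions for this comment should match the coding result table.
code = codes["ytr_UgiWsDgzf98Z9HgCoAEC.7-H0Z7-UsSH7-TgWFYDZxd"]
print(code["reasoning"])  # consequentialist
print(code["emotion"])    # indifference
```

Comparing these looked-up values against the displayed table is a quick way to confirm the coded dimensions were extracted from the model output rather than defaulted.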