Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I agree with Zielen. If we develop AI, we could pack that AI into a robot body and ship it off to some struggling village. That AI would then be able to help the people in that village do everything and more that experts from various fields would be able to do. It could serve as a teacher, a doctor, and a detective for discovering the truth of crimes. It could, in the process of teaching the young, instill better methods of farming and construction. One body which could be mass produced and could help an entire village. One of the main issues for developing nations is education, and an AI which could serve as a teacher and interact with each student simultaneously and independently (via computers) could help the students become as intelligent as students of any developed nation in one, maybe two generations. More simply: the further we develop our own technology, the more we're able to help those without said technology. The issue won't be whether we have to focus on this or that; it'll be whether at some point we decide to take a hands-off approach such as in Star Trek.
youtube 2014-10-27T08:2… ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgiEfPymBkOFwngCoAEC.7-H0Z7-7JpP8gH-15fylJR","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugh4g1uNgBXMTngCoAEC.7-H0Z7-7wKg718vNnfSvF3","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugh9et7F8W7eSngCoAEC.7-H0Z7-DQds7-YuvxMbkKb","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Uggpj4TNsFtRKHgCoAEC.7-H0Z7-D6bD7-J0mgMBTtf","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UggI9deWoPDPHHgCoAEC.7-H0Z7-ESMt7-IF9iu1HMr","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UggI9deWoPDPHHgCoAEC.7-H0Z7-ESMt7-LKTFo4P_P","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UggI9deWoPDPHHgCoAEC.7-H0Z7-ESMt7-LQEGwDiSN","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugi84STC_TPKHHgCoAEC.7-H0Z7-7GFO73ySuDvvDJ6","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiOhCIYTUwmK3gCoAEC.7-H0Z7-5rf97-Wr5tC4gkr","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgiOhCIYTUwmK3gCoAEC.7-H0Z7-5rf97-XUoVGtrVQ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
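A raw response like the one above can be parsed and sanity-checked before its codings are stored. A minimal sketch, assuming each row carries an `id` plus the four dimensions shown here; the allowed category sets are inferred from the values visible in this response and the real codebook may define more:

```python
import json

# Allowed values per dimension, inferred from the raw responses above.
# Assumption: the full codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "fear", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with missing ids
    or out-of-vocabulary dimension values."""
    codings = json.loads(raw)
    for row in codings:
        if "id" not in row:
            raise ValueError("coding row is missing 'id'")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {value!r}")
    return codings

# Usage with a single (hypothetical) coding row:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
rows = validate_codings(raw)
print(len(rows))  # 1
```

Failing fast here means a malformed model response (a hallucinated category, a dropped id) surfaces at coding time rather than as a silent gap in the results table.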