Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
No AI does not think at the current time. What it does is match your inquiry to the more common human responses. If you ask - who was the first to have controlled powered flight - it will say the Wright Brothers - which is incorrect; it excludes lighter than air for no reason other than most humans do. The proof of this is that current AI models have to be trained with human inputs. If there were no human inputs - say how to drive a segment of roadway - AI can't figure it out - it uses human inputs to determine what humans would do - humans are wrong a lot. IF Ai thought - it wouldn't need human inputs and would find the facts based on real facts and not human opinions or actions. If such AI ever came to exist - we likely would be killed off as being a threat.
youtube · AI Moral Status · 2026-03-02T06:0… · ♥ 6
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgwiEDvqTesk_UlEzih4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOxOsuEn2ejy8jzAl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwJ3Hkh826nX_zG49N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw-23u46madYKseenJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwgjwzYQHflu2xOGY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxbntoRWLqdexkk_054AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyzcQCCev53NnCgI4N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx34S08LynliShVHm94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwhQLZ_nBt2L9ydsUB4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzEiHxNOe8jKUkkkhJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
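A minimal sketch of how a response like the one above can be inspected, assuming the raw LLM output is a JSON array of per-comment codes (this is not the app's actual code; the variable names are illustrative):

```python
import json

# First entry of the raw LLM response shown above, used as sample data.
raw = ('[{"id":"ytc_UgwiEDvqTesk_UlEzih4AaABAg",'
      '"responsibility":"ai_itself","reasoning":"unclear",'
      '"policy":"none","emotion":"indifference"}]')

# Index the coded dimensions by comment id for quick lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

row = codes_by_id["ytc_UgwiEDvqTesk_UlEzih4AaABAg"]
print(row["responsibility"], row["emotion"])  # ai_itself indifference
```

Indexing by `id` lets each coded comment's dimensions be matched back to its card on the page.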