Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
When people (like Elon) are trying to seem intelligent by simulating algorithmic thought, they are just becoming narrowly limited
youtube 2026-02-04T07:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          unclear
Emotion         unclear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugy6xL731CkZn2duzmt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxXefqgLnN74baPYg54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxHUaxq3AR10pgPo3V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw6KuR-r10YAJ9f0-p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwjxCX6tWu1DVSogcx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyvVz6xt05x5J2MG7h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw1qWdnal_boPSNz394AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_UgzGotAiALazgrCoL9V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw9ip983XYLLqJ9qmB4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxAzqPd9aiNlFTso7d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
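The raw response is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of looking up one comment's coded dimensions from such a response (the variable names and the single-element example string are illustrative, not part of the tool's actual code):

```python
import json

# A shortened stand-in for the raw LLM response shown above:
# a JSON array of coding objects, one per comment.
raw = (
    '[{"id":"ytc_Ugy6xL731CkZn2duzmt4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"outrage"}]'
)

# Index the codings by comment ID for direct lookup.
codings = {item["id"]: item for item in json.loads(raw)}

# Retrieve the coding for a specific comment.
coding = codings["ytc_Ugy6xL731CkZn2duzmt4AaABAg"]
print(coding["responsibility"])  # -> developer
print(coding["emotion"])         # -> outrage
```

Comparing such parsed values against the displayed Coding Result table is one way to spot mismatches between the stored coding and the model's raw output.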