Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you realize this is a simulation then you realize what's called "ai" is just a human program in a server/android instead of a human program in a human body it's a little f'd up. @startalk
YouTube · AI Moral Status · 2026-04-05T08:2…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxJYEki34MndK0kVIl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgzbqoDhIpr9G1oL6i54AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyYDiF1GVialVMFPaN4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugw4meNGb2pgv2q6J3B4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy_-TfhKJVV4zCGM8d4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyJtfuJpLKjEU0DUUZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzycjQSHE4DyjwLcPt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzNpKaNRb0GoQcAl4V4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzkkDMVslNTkBB0kt14AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzalt5UyU_12amqinF4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"}
]
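A raw response like the one above can be turned into per-comment records by parsing the JSON array and checking each dimension against the code book. The sketch below is a minimal, hypothetical example: the `ALLOWED` sets are inferred only from the values visible in this response, and `parse_raw_response` is not part of any tool shown here — the real code book and pipeline may differ.

```python
import json

# Allowed values per coding dimension (inferred from the response above;
# the real code book may contain more categories).
ALLOWED = {
    "responsibility": {"unclear", "developer", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate"},
    "emotion": {"unclear", "mixed", "approval", "indifference", "fear", "outrage"},
}

def parse_raw_response(text):
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment id, rejecting values outside the code book."""
    records = {}
    for item in json.loads(text):
        rec_id = item["id"]
        for dim, allowed in ALLOWED.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{rec_id}: unexpected {dim}={item.get(dim)!r}")
        records[rec_id] = {dim: item[dim] for dim in ALLOWED}
    return records

# Two records copied verbatim from the raw response above.
raw = '''[
 {"id":"ytc_UgxJYEki34MndK0kVIl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
 {"id":"ytc_UgzkkDMVslNTkBB0kt14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]'''

coded = parse_raw_response(raw)
print(coded["ytc_UgzkkDMVslNTkBB0kt14AaABAg"]["emotion"])  # outrage
```

Validating at parse time catches the most common failure mode of LLM coders — an out-of-vocabulary label — before it silently enters the dataset.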