Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why would we program a sentient AI to do manual labor? If it is a job that requires sentience (like philosophy and not much else) why wouldn't we use rewards for achieving the task, rather than punishments for not?
youtube AI Moral Status 2018-06-18T22:5…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgyGfDw1xgN5DCKJA9l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgxakgLjD1EzYZnXfNB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzGX4hl6YJebGzrfzZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyTegkK315HFxl4qbl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyugDsuwkPGzlzIlYx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyaN6sJhihdnnlYSdd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgwFD6uewSSBzD-xBAR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxmtyIG_a7L_Oq0qbV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgxT40Zl2wApmAbXWyB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyD4g_ysMF1LTzscmp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"})
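Note that the raw response above ends with `)` rather than the `]` needed to close the JSON array, so a strict parser will reject the whole output. That is one plausible explanation for every dimension being coded "unclear". A minimal sketch of how a coding pipeline might validate such a response and fall back to "unclear" on failure (the fallback behavior and function names here are assumptions for illustration, not the tool's actual implementation):

```python
import json

# Default coding used when the model's output cannot be parsed (assumed behavior)
FALLBACK = {"responsibility": "unclear", "reasoning": "unclear",
            "policy": "unclear", "emotion": "unclear"}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse the model's JSON array; return an empty list if it is malformed."""
    try:
        parsed = json.loads(raw)
        return parsed if isinstance(parsed, list) else []
    except json.JSONDecodeError:
        return []

def code_for(comment_id: str, raw: str) -> dict:
    """Look up one comment's coding, falling back to all-'unclear' values."""
    for entry in parse_llm_response(raw):
        if entry.get("id") == comment_id:
            return {key: entry.get(key, "unclear") for key in FALLBACK}
    return dict(FALLBACK)

# A response ending in ')' instead of ']' fails to parse,
# so the comment is coded "unclear" on every dimension.
raw = '[{"id":"ytc_x","responsibility":"developer"})'
print(code_for("ytc_x", raw))
```

Under this sketch, a single stray character in the model output silently degrades the coding for every comment in the batch, which is why inspecting the exact raw response is useful.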