Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We only feel pain to condition ourselves to make decisions which will make us more likely to breed thanks to evolution, however evolution is a process of "good enough" solutions, not necessarily ideal solutions so I don't think pain is the best way to condition a consciousness. Therefore, we'd never have any reason to program pain and nor would AI making other AI (or at least I hope).
youtube AI Moral Status 2018-05-02T14:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzHeP5PuF__paoccSV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzZllfuTlv47zYMWPN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwohNn0l0WX4TmzveF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzC_pylmq5gOFFIunl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxmjLPpZesA_tUGjpl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyorlZZ8G8SoYY8hCB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyQbO8ZCx5Bk7mVcGh4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxDqx5ZEsC8233V4bR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxQHHfA9qCXPW0wg-94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzEZ1V5gxmaROZCJoF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
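A minimal sketch of how the raw response above can be matched back to an individual comment, assuming the response is valid JSON and that each record's `id` is the comment's YouTube id. The `lookup` helper is hypothetical, not part of the coding pipeline; only two records from the response are reproduced for brevity.

```python
import json

# Excerpt of the raw LLM response shown above: each record codes one
# comment on four dimensions (responsibility, reasoning, policy, emotion).
raw_response = '''[
  {"id": "ytc_UgzHeP5PuF__paoccSV4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzZllfuTlv47zYMWPN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

def lookup(raw: str, comment_id: str):
    """Return the coded record for a comment id, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

# The comment shown above was coded consequentialist / indifference:
rec = lookup(raw_response, "ytc_UgzZllfuTlv47zYMWPN4AaABAg")
print(rec["reasoning"], rec["emotion"])  # consequentialist indifference
```

This is how the "Coding Result" table above is derived from the raw response: the record whose `id` matches the displayed comment supplies the four dimension values.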