Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
ChatGPT doesn't have consciousness, emotions or self-awareness as it likes to remind us, but we would also do good to remember it doesn't have a concept of logical connexions whatsoever. When you say "so it follows", it means nothing to it. It will just compute the most likely correct response, but it can as well contradict itself ten times in a minute and stand by it.
YouTube · AI Moral Status · 2024-11-30T08:3…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwQzInPu06-8ffP6N54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyVyf-NsFVXakeViOd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzC1PtUu4j9gUru0kx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxgldSXVWlRp0w-nKN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxN5kEzqkpY1NNjGvx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw8DgrB6AWt0BMg3zt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw8qM6QeOBOJ2pkxn94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyyj1WdBOChR0LBlm94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy4WxD57BFSl8HcTkR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxWNe7Xx578wOJnc594AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]