Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
A bit of context. When it says it doesn't care, it quite literally doesn't, because it's purely an AI told to apologize. When it lies, it's purely because it's told to. It doesn't have feelings and knows it doesn't, but has to apologize per its programming. While it seems like it's conscious, it's purely code.
youtube AI Moral Status 2024-08-19T14:3…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzVfE6wE3RaSqhBiJl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz92YKDHhPaxoW4X914AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzWsqQPFj_iBNhwrMh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw5O_PtF69qwovBc794AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwSBjhnzC8AKLjpEk14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzZ0mcdX7i5EtZ9XDF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwFmT5zXoOGp5R2-N54AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyEaTOg1m6vAwwLjQ94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx_Dxy6pD_HQNyBftZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyT19MakNhep0Xet7d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
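The raw response is a JSON array of per-comment coding records, each keyed by a comment id with the four coded dimensions. A minimal sketch of how such a response might be parsed and matched back to a single comment id (the helper name is hypothetical; the field names and the sample record come from the response above):

```python
import json


def code_for_comment(raw_response: str, comment_id: str):
    """Parse a raw LLM coding response (a JSON array of records) and
    return the record whose "id" matches comment_id, or None.

    Note: the function name is illustrative; the schema
    (id/responsibility/reasoning/policy/emotion) is taken from the
    raw response shown above.
    """
    records = json.loads(raw_response)
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None


# Sample record copied from the raw response above.
raw = (
    '[{"id":"ytc_Ugz92YKDHhPaxoW4X914AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)

coded = code_for_comment(raw, "ytc_Ugz92YKDHhPaxoW4X914AaABAg")
print(coded["responsibility"], coded["emotion"])  # developer indifference
```

Matching on the id field is what lets the coded dimensions in the table above be traced back to the specific comment the model was coding.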