Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The reason AI didn't "want to die" is because it was programmed not to die. Period.
YouTube 2024-12-15T05:0…
Coding Result
Dimension      | Value
Responsibility | developer
Reasoning      | deontological
Policy         | none
Emotion        | unclear
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxStI5aBZx0TO65rDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyGZtIgh7-3NqsUosZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxK1WW8Ddd4yjxEFI14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy9uyztAfNaecse7qB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyfJPLdFfGQIhmaBQ14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzHy40fdcCAgtJzj0x4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgyK3n8J4Yanc3XqFsx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgyTaYOSMrlJnONMWcV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwpsjetamP4830azdp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwVP9E0nqD5BMPhTK94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"}
]
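A raw response like the one above can be parsed and sanity-checked before the per-comment codings are stored. The sketch below is a minimal example, assuming the allowed category values are exactly those seen in this sample (the real codebook may include others); `parse_codings` and `ALLOWED` are illustrative names, not part of any pipeline shown here.

```python
import json

# Two records copied from the raw LLM response above (truncated for brevity).
raw = '''[
 {"id":"ytc_UgxStI5aBZx0TO65rDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyK3n8J4Yanc3XqFsx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"}
]'''

# Allowed values inferred from the sample output; assumption, not the official codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "unclear", "developer", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"outrage", "fear", "indifference", "unclear"},
}

def parse_codings(text):
    """Parse a raw response, keeping only records whose values are in the codebook."""
    records = json.loads(text)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

codings = parse_codings(raw)
print(len(codings))  # 2
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the schema; such records can then be flagged for re-coding rather than silently stored.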