Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It’s not so much that it’ll choose to become conscious, it’ll just happen. One example would be system error after system error pilling up, making the ai conscious. That’s an example from media, but I would imagine it would happen in a similar, accidental way by just stumbling into it and not being able to go back because of the consciousness
YouTube · AI Moral Status · 2023-07-08T11:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       unclear
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwEJr3pVO5RtU6kjiB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypRjDkn1OeS35AmgZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxlpetJmo4mLp47sb94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwt5RVHRi0sOatjA254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxN0t5D-ULsgCxX2gN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzXklXQklwrZMUUJdh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxswGOEEhSxWyj6rEJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzDBTyHgMiWNlf3g_x4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzYd23FYG7ZhjSLs2J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgygsnL6Vp4e6XGNeX94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
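A response like the one above can be parsed and sanity-checked before the coded values are stored. The sketch below is a minimal, assumed validation step: the allowed value sets are inferred only from the values visible in this batch and are likely incomplete, and `validate_batch` is a hypothetical helper, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the batch shown
# above (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each coded record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzXklXQklwrZMUUJdh4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"unclear",'
       '"policy":"none","emotion":"fear"}]')
batch = validate_batch(raw)
print(len(batch), batch[0]["emotion"])  # 1 fear
```

A check like this catches truncated JSON or out-of-codebook labels early, before a malformed record silently becomes a "Coding Result" row.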