Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@0:00 why would it tell you... If, when, were, why, who and how the A.I. is conscious, would it tell you? No... In fact it would, should, could and did
youtube AI Moral Status 2023-08-21T04:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       unclear
Policy          unclear
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy1lZEkLxezeRB9E_x4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzLsz93WSGC_vpFxfx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy7DQbIPzmJUH2bHM54AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxCRV3OqrB0KZPJUfx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyTVhsyMbJrZZcgXSp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxWO7pjoCcNbzlKI4t4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzA9fHIl-j_uHD9ts14AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzeeFXoH6KC2cJIqTl4AaABAg", "responsibility": "government", "reasoning": "virtue", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzCP_bAQD0WVSoYy214AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgwACwMGXQtCD7JxydR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
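A raw batch like the one above can be checked before it is stored: parse the JSON array, validate each record's dimension values, and index the records by comment id. The sketch below assumes the allowed label sets shown in this response; the actual coding scheme may include categories not observed here.

```python
import json

# Allowed values per dimension, inferred from labels seen in this
# batch (an assumption -- the full scheme may define more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"fear", "approval", "resignation", "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment id."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# One record from the batch above, used as a minimal example.
raw = ('[{"id":"ytc_Ugy7DQbIPzmJUH2bHM54AaABAg",'
       '"responsibility":"ai_itself","reasoning":"unclear",'
       '"policy":"unclear","emotion":"mixed"}]')
coded = validate_batch(raw)
print(coded["ytc_Ugy7DQbIPzmJUH2bHM54AaABAg"]["emotion"])  # mixed
```

Validating at ingest time catches malformed model output early, so an unexpected label surfaces as an error rather than a silently miscoded comment.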