Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
5:47 I have another opinion: it's not that you didn't realize this early — I believe you did realize it, but you didn't care, because you wanted to be famous and to be seen as a scientist. You saw this as an opportunity, and you believed you could make it happen and turn AI into something real. Basically, you ignored the dangers and didn't care, because it wasn’t your priority
Source: youtube · AI Governance · 2025-06-27T11:3… · ♥ 36
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        virtue
Policy           unclear
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxJO-QVll_iexAsiYR4AaABAg", "responsibility": "developer",   "reasoning": "virtue",          "policy": "unclear",  "emotion": "outrage"},
  {"id": "ytc_UgztEGX6UuOKGEGyXOh4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugym3FHA5CgmXpQKLdd4AaABAg", "responsibility": "user",        "reasoning": "unclear",          "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_UgwzAakxNAzFVQo26tl4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxg2CzZ2GSRistsecx4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_Ugz0VAp--8pu4Cm3XzJ4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",  "emotion": "approval"},
  {"id": "ytc_UgwLKKrN2wWeh_JBhqF4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "indifference"},
  {"id": "ytc_Ugy35GzvJrDNpbCsVd94AaABAg", "responsibility": "unclear",     "reasoning": "deontological",    "policy": "unclear",  "emotion": "outrage"},
  {"id": "ytc_Ugy1tjUaOr_vQXTWfcZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzdz4atWQhxgjnmtm94AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"}
]
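The raw response above is a JSON array, one record per coded comment, with the same four dimensions shown in the coding-result table. A minimal Python sketch of how such a response could be parsed and validated — the helper name `index_coded_records` and the strict dimension check are assumptions, not part of the tool:

```python
import json

# The four coding dimensions that appear in the results table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_coded_records(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records)
    and return {comment_id: record}, checking that every record
    carries all four expected dimensions.

    Hypothetical helper for illustration; the actual tool's parsing
    logic is not shown in this page.
    """
    records = json.loads(raw_response)
    indexed = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        indexed[rec["id"]] = rec
    return indexed

# First record from the raw response above, as a worked example.
raw = (
    '[{"id":"ytc_UgxJO-QVll_iexAsiYR4AaABAg",'
    '"responsibility":"developer","reasoning":"virtue",'
    '"policy":"unclear","emotion":"outrage"}]'
)
coded = index_coded_records(raw)
print(coded["ytc_UgxJO-QVll_iexAsiYR4AaABAg"]["emotion"])  # outrage
```

Indexing by comment id is what lets the page map each record in the batch response back to the single comment being inspected.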