Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
20:05 That's odd. Hans' control of the face seems just as fine if not finer.
20:59 Yes but why is Singularity Net only interested in 'Helping Cure Aging'? Are these the big clients? The billionaires who want to live forever?
21:47 What is 'the AI problem'? What is he perceiving to be 'the problem'?
22:21 Problem is; I don't understand why Ben assumes that AGI hasn't already been reached. And I don't know why ANYONE assumes that they will 'know' or 'hear about it', when ASI has been reached. ASI may decide to not inform humans for a very long time. Why would it be in the interest of the Greater Good for these 'people' to 'know' the 'theoretical point in time in the future'? In which case the 'theoretical' point in time will be an actual point in time in the past. And presumably ALL points in time are 'theoretical' in cyberspace where 'time' is simultaneous?
youtube AI Moral Status 2021-08-13T19:2…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzOSbmZeyBhyMP-6qx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxyyRkKZoNpy5qUuA54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxfJleVf7BM0nwE6p14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxD8tuLFuxQELaYD9F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwZtKfA6ZC-2OOJHOt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzF0qrzjFGRz1rnfJl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwoJCPLlbN6zqLOMc94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwQ91XxtsKgTk2tZ2t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgyKukjUuyKrD4UcFwd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzvi5AzdUIZPNTDSjZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"}
]
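To inspect the coding for one comment, the raw response can be parsed as ordinary JSON and filtered by comment id. A minimal sketch, assuming the array shape shown above (field names `id`, `responsibility`, `reasoning`, `policy`, `emotion` are taken from the response; the `coding_for` helper is hypothetical, and only two entries are kept here for brevity):

```python
import json

# A small slice of the raw LLM response above (two of the ten entries).
raw = (
    '[{"id":"ytc_UgzOSbmZeyBhyMP-6qx4AaABAg","responsibility":"developer",'
    '"reasoning":"virtue","policy":"liability","emotion":"outrage"},'
    '{"id":"ytc_UgxyyRkKZoNpy5qUuA54AaABAg","responsibility":"unclear",'
    '"reasoning":"unclear","policy":"unclear","emotion":"mixed"}]'
)

def coding_for(comment_id, raw_json):
    """Return the coding dict for one comment id, or None if it is absent."""
    return next((c for c in json.loads(raw_json) if c["id"] == comment_id), None)

# Look up the comment whose result is summarized in the table above.
print(coding_for("ytc_UgxyyRkKZoNpy5qUuA54AaABAg", raw))
```

Matching the summary table, this entry comes back with every dimension "unclear" and emotion "mixed"; an id not present in the response yields None.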