Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Neil very often doesn't answer the question due to long-winded tangents and prefaces. I really wanted to hear him talk about AGI and he spent the whole time on a preface defining AI. Disappointing.
youtube AI Moral Status 2025-08-25T16:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzAA3MC9UcR8jr_FtZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzqrdEU7lnWriTJTU54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz14A5rJAd1rOvAHV14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzIqRqsQc7pWE-Jjm54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzQn-egPg5wOhWauQZ4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwPz9KI3h17pYu0WdF4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxKcsUwARd95Gs0i_Z4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy-vtJL40aMP04gScR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzk2oRTji4A3edi4At4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzRaPUVzGqpDhZ9ytZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]
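A minimal sketch of how a raw response like this could be parsed back into per-comment codings, using Python's standard json module. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response above; the array here is truncated to one entry for brevity.

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (truncated to a single entry for illustration).
raw = '''[
  {"id": "ytc_UgzRaPUVzGqpDhZ9ytZ4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "resignation"}
]'''

# Index the codings by comment id for easy lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Retrieve the coding result for a specific comment.
coded = codings["ytc_UgzRaPUVzGqpDhZ9ytZ4AaABAg"]
print(coded["emotion"])  # resignation
```

In practice the response would be validated as well (e.g. checking that every id in the batch appears exactly once and that each dimension's value is from the allowed set), since LLM output is not guaranteed to be well-formed JSON.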