Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Bro we haven't even solved alignment for other humans. What on earth would we even AI to? Or I guess I should ask who we should align it to.
YouTube · AI Moral Status · 2023-08-20T19:0…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       contractualist
Policy          regulate
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy4utEftaASPCMyopJ4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",   "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgyEuWdCV6IpPLdM62V4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "ban",           "emotion": "outrage"},
  {"id": "ytc_UgzDhuCwaGEVw1HQgxh4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgwnK35OlHQfyvZALEJ4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgzPuhn74jWvLASHyqp4AaABAg", "responsibility": "government",  "reasoning": "deontological",    "policy": "liability",     "emotion": "mixed"},
  {"id": "ytc_Ugy0X7Wgl_23mdJber14AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgxyYNEBP7kcyXGvO8B4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_UgwQ6AmEK_HS6WtfegV4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytc_Ugw9Acwu6LYm6OeNdxV4AaABAg", "responsibility": "distributed", "reasoning": "contractualist",   "policy": "regulate",      "emotion": "resignation"},
  {"id": "ytc_Ugyr3Rbzj09U3svvRLZ4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "approval"}
]
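The raw response is a JSON array with one object per comment, keyed by comment id. A minimal sketch of how such a response might be parsed and matched back to a single comment's coding result (the field names come from the response above; the variable names and the two-record excerpt are illustrative):

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten records).
raw_response = '''
[
  {"id": "ytc_Ugw9Acwu6LYm6OeNdxV4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugyr3Rbzj09U3svvRLZ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
'''

# Index the batch by comment id so each comment's codes can be looked up.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Pull out the record that corresponds to the comment shown above.
result = codes["ytc_Ugw9Acwu6LYm6OeNdxV4AaABAg"]
print(result["responsibility"], result["policy"])  # distributed regulate
```

Because the model returns codes for a whole batch in one array, indexing by `id` is what lets the tool display a single comment's coding result alongside the full raw response.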