Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
LLMs are a bottleneck AGI wont be achieved with it not like this. We will burn the planet before reaching something we deem AGI.
youtube AI Moral Status 2025-10-30T18:4… ♥ 10
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzeXbUL950H6V5GK2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxR4eosZAnd9da5cJ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6wxgMZ9daub2A7HR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy_t5pevmQVikD5RvJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxLBKlZi_lNvjRKnTJ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzFjAkmqxk_AWvQBTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzknjjqyve8hfyXC-N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZM9V6wgmcWz4W-f14AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxdlhLl0yWDzo-oxI94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyYp_oGm8IXsGmMMvh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
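A minimal sketch of how a response like the one above can be parsed and sanity-checked before use. The allowed value sets below are only inferred from the values visible in this batch, not from the full codebook, and the two sample rows are copied from the response; both are assumptions for illustration.

```python
import json

# Two rows copied from the raw response above, for a self-contained example.
raw = '''[
  {"id":"ytc_UgzeXbUL950H6V5GK2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6wxgMZ9daub2A7HR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Value sets inferred from this response only; the real codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"resignation", "outrage", "indifference", "approval", "fear", "mixed"},
}

codes = json.loads(raw)
by_id = {}
for row in codes:
    # Reject rows whose values fall outside the expected coding scheme.
    for dim, allowed in ALLOWED.items():
        if row.get(dim) not in allowed:
            raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    by_id[row["id"]] = {dim: row[dim] for dim in ALLOWED}

print(by_id["ytc_Ugz6wxgMZ9daub2A7HR4AaABAg"]["emotion"])  # outrage
```

Keying the parsed rows by comment `id` makes it easy to join each coding result back to the original comment record, as the card above does.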