Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We can’t even come up with computers that don’t crash for zero reasons or batteries that won’t last more than 2000 charges. But we’re going to be able to control some kind of crazy rogue AI? Thank god I’ll be dead before we off ourselves with our immense stupidity. Natural disasters? Well, that’s a whole different ball of wax 🤷🏻‍♂️
youtube · AI Governance · 2024-05-07T05:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxMcNwx2fFt5M5NOjB4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugzw2V7_R4BdVVArXNV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxxJjF7Y9o8wMMFg8F4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxlcjHdc1arOhAuyFZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxAGGsv58Tjla2Nsn54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzk11GKj06G3vS6zrh4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugy_8Lk3fax7VwXO0IB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz1gK4FcwUPp2GDlaB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgybRlvRh1uGamF3-dp4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwP6qUND4yj5xkfm794AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
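A raw batch response like the one above can be turned into per-comment coding results by parsing the JSON array and indexing the records by comment id. This is a minimal sketch of that step; the function name `parse_coded_comments` is illustrative, not part of the tool, and it assumes the model returned a well-formed JSON array with the field names shown above.

```python
import json

def parse_coded_comments(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by their comment id."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

# One record copied from the raw response above, for illustration.
raw = ('[{"id":"ytc_UgwP6qUND4yj5xkfm794AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')

coded = parse_coded_comments(raw)
print(coded["ytc_UgwP6qUND4yj5xkfm794AaABAg"]["emotion"])  # resignation
```

In a real pipeline the lookup key would come from the comment's stored id, and a `json.JSONDecodeError` from a malformed response would need to be caught and the batch retried or flagged.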