Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why not just treat AI the way we treat everything else, and just wait until after it becomes a problem? It's working for BudLight, isn't it?
YouTube AI Governance 2023-04-18T07:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugww0jXDu3RzlonYwjZ4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugxfb0Tya454pQT0A914AaABAg", "responsibility": "developer",  "reasoning": "virtue",           "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_UgxkwMIhLpk2j0wVyV54AaABAg", "responsibility": "government", "reasoning": "deontological",    "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_Ugwv-gYPkk0owwIggLV4AaABAg", "responsibility": "developer",  "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugx2YnkDQSSYi3zZ0994AaABAg", "responsibility": "developer",  "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgxFco1cl3Cqd2Doh814AaABAg", "responsibility": "ai_itself",  "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgypfytreuxykgDomwV4AaABAg", "responsibility": "developer",  "reasoning": "virtue",           "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_Ugw_ZmgXgBrfzAiRFwR4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyfnQjIzT9mf4kEXxh4AaABAg", "responsibility": "unclear",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgxcdhZ4WC_HVCdnYoZ4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"}
]
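To connect a raw response like the one above back to a single comment's coded result, you can parse the JSON array and look up the record by its "ytc_…" comment id, verifying that every coding dimension is present. A minimal sketch (the `lookup` helper and `DIMENSIONS` tuple are illustrative, not part of the coding pipeline; the ids and values are copied from the response above):

```python
import json

# Abbreviated raw model output: a JSON array of coding records,
# one object per comment, keyed by a "ytc_…" comment id.
raw = '''[
  {"id": "ytc_Ugww0jXDu3RzlonYwjZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxkwMIhLpk2j0wVyV54AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]'''

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw_json: str, comment_id: str) -> dict:
    """Return the coding record for one comment id, checking all dimensions exist."""
    records = {r["id"]: r for r in json.loads(raw_json)}
    record = records[comment_id]  # KeyError if the model skipped this comment
    missing = [d for d in DIMENSIONS if d not in record]
    if missing:
        raise ValueError(f"record {comment_id} missing dimensions: {missing}")
    return record

print(lookup(raw, "ytc_Ugww0jXDu3RzlonYwjZ4AaABAg")["emotion"])  # indifference
```

The dimension check matters because model output is not guaranteed to be well-formed: a record with a dropped key should be flagged rather than silently coded as missing data.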