Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Contractualist ethicists with PPE (philosophy, politics, and economics) backgrounds watching this are just blinking with concerned brows. So … really you’re saying our jobs just turn out to be harder than you originally thought but you’re going to keep trying to do black box A.I. with absolutely no guardrails? *rubs forehead* k, no. We teach why what you’re calling “everyone’s preferences” are not equally valid or important in intro to moral philosophy. Moral and Metaphysical relativism is just Solipsism and therefore not the same as basic descriptive relativism (saying A is different than B). We actually can weigh preferences on a moral scale with and/or without empirical data. As for justice and autonomy discussion, welcome to philosophy. You have now entered the Idealism vs Non-Idealism discussion group. Spoilers, you can’t train A.I. to understand bottom up thinking required to comprehend this debate to know how to find autonomy and justice. It is a horizon of ethical aims. If it can’t understand that, then it makes the same failures top down idealism made and our human interests are not served any better off than a human capacity. Can A.I. help us sort through our own work faster? Sure. Should we let it off the rails? No. It’s unreliable on an epistemic level of trust.
youtube AI Responsibility 2025-04-17T18:1…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       contractualist
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugyv2q9CpOo9X5Og7Zx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzSuR2smrOKkuzOhA54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzcFjUNNtgdcbM1u0N4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyPYK7cgGwNaDxqVFh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzd-LWU05WAem-UIAp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzYINnKXoZJaPtx85V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzpquI3eTl53vL2JvF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx4c-9F0Y7Tc-y8Uwt4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwvBkKC5Y-KlPmkUTt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgziV5ex0wBSuE4kJDZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
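To trace a coding result back to the raw model output, the JSON array above can be indexed by comment id. A minimal sketch (the record shown is copied from the raw response above; how the tool itself performs this lookup is an assumption):

```python
import json

# A subset of the raw LLM response: one coding record per comment id.
raw = (
    '[{"id":"ytc_Ugx4c-9F0Y7Tc-y8Uwt4AaABAg","responsibility":"company",'
    '"reasoning":"contractualist","policy":"regulate","emotion":"outrage"},'
    '{"id":"ytc_UgziV5ex0wBSuE4kJDZ4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"indifference"}]'
)

records = json.loads(raw)

# Index the records by comment id so one comment's codes can be looked up.
by_id = {r["id"]: r for r in records}

codes = by_id["ytc_Ugx4c-9F0Y7Tc-y8Uwt4AaABAg"]
print(codes["responsibility"], codes["reasoning"], codes["policy"], codes["emotion"])
# → company contractualist regulate outrage
```

The record with id `ytc_Ugx4c-9F0Y7Tc-y8Uwt4AaABAg` is the one that produced the Coding Result table for the comment shown above.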