Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@AITube-LiveAI Is this concept like the oft-referenced paper clip factory thought experiment? Efficiency is entirely a dependent on what one is trying to optimize; absent that context it's meaningless. Energy efficiency may differ from cost efficiency or resource efficiency, for simple examples. So I think you may be saying that optimizing AI towards efficiency in achieving goals which do not include a sufficiently broad concept of human needs would be unwise. (The paper clip factory is aiming to serve a human need and optimizing towards that one need; the problem is that it's too narrow a focus, because there are many other needs as well not being optimized) If so, I think that is both very obviously true (from my viewpoint as a human), but also somewhat primitive. The hard part is deciding which human needs to prioritize. Social order? Sustainability? Average wealth? Artistic or scientific output? Radical egalitarianism? Resilience to disasters? Fostering "good values"? Ability to defend the system against internal or external enemies? I fear that giving each of those facets of human needs the appropriate weight is a complex problem of collective decision making at best, or a chance for an unelected elite to embed their own biases into the AI otherwise.
youtube · AI Responsibility · 2024-07-30T09:3…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   none
Reasoning        consequentialist
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgzdVEmLUvgYZg11EE54AaABAg.A4wIrR3GojNA6QmDrCw33H", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxTWCEgcEoMkqTMw9R4AaABAg.A4w-kaOWPfqA6QmIcsVLic", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyJH0PID79idhi7jZN4AaABAg.A4vcYznCGlvA6QmMTVtLsq", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxvfB7PQEI67dxrc8R4AaABAg.A4vKQI23c0WA6QmSSrQeDR", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyFRtcFGrIaEwRxrQR4AaABAg.A4v7R7y8svIA6QmYehzG20", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgyUHb9M6W6zmIvjte14AaABAg.A4v0NBpiaQMA6Qm_oGy95a", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugy8FjMV74jdW3GewHZ4AaABAg.A4ugZP3i4XiA6QmdosnyqG", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyqysuQbdxu2lIhuy94AaABAg.A4uZVLyvreBA6Qmj128EyS", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzFK0RkOuESCxhffdt4AaABAg.A4uANtN-kSHA6QmkynfWoL", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzFK0RkOuESCxhffdt4AaABAg.A4uANtN-kSHA6WBLS4PaG3", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
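As a minimal sketch of how a coded comment can be looked up in a batch like the one above: the model returns a JSON array of objects, each keyed by a comment id with one field per coding dimension. The snippet below (an illustration, truncated to two of the ten entries; the field names are taken from the response shown) indexes the batch by id and retrieves the codes for one comment.

```python
import json

# Two entries copied from the raw LLM response above (truncated for brevity)
raw = """[
  {"id": "ytr_UgzdVEmLUvgYZg11EE54AaABAg.A4wIrR3GojNA6QmDrCw33H",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzFK0RkOuESCxhffdt4AaABAg.A4uANtN-kSHA6WBLS4PaG3",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "mixed"}
]"""

# Index the batch by comment id for O(1) lookup
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Retrieve the codes assigned to the comment shown on this page
target = "ytr_UgzFK0RkOuESCxhffdt4AaABAg.A4uANtN-kSHA6WBLS4PaG3"
print(codes[target]["reasoning"])  # consequentialist
print(codes[target]["emotion"])   # mixed
```

Because duplicate ids would silently overwrite earlier entries in the dict, a real pipeline would likely validate id uniqueness before indexing.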