Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hello Drew,

This "Intelligence Curse" scenario is terrifying because it is mathematically rational for the board. As long as "Efficiency" is the only metric, humans lose every time. But there is a missing variable in this equation that we need to weaponize: Brand Toxicity.

If a company replaces its workforce with AI, it should not just be a "PR issue"; it should be a Structural Liability. We need to introduce the concept of "No Fault Redundancy" Protection:

- The AI Tax: If a role is automated, the company pays a specific tax that funds the UBI/Retraining for the displaced worker. This removes the "pure profit" incentive of firing them.
- The Brand Shield: We need to aggressively support "Human-First" certified companies. If a bank or tech firm purges its staff for bots, the public needs to treat their brand as toxic.

The CEO in your story caved because the cost of keeping humans was higher than the cost of firing them. We need to flip that math. Firing a human for an algorithm should be the most expensive decision a board EVER makes.
Source: YouTube · Viral AI Reaction · 2025-12-04T14:1…
Coding Result
Dimension        Value
---------        -----
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxnVl7qsQHr8Npr31F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwN3B6O8W6SbRTEy2R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyk806Ac8VgInhfsg54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyb5kzUq0c4zirJQel4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzklGenAO1pfH021ZN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy-0T8cU6D3Ot4snJF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxU3hkBff4OqEQQfoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx9_kA1dPyFbhFNW-B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxqgySLYyKfcYvRD0B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyce_mB-E56xg7xIQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
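A raw response like the one above can be parsed into per-comment codes and checked against the coding-result table. The sketch below is a minimal illustration, assuming only the JSON shape shown (an array of objects with `id` plus the four dimensions); the two records used are copied from the response, and everything else (variable names, the tallying step) is illustrative, not part of the tool.

```python
import json
from collections import Counter

# Two records copied verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_Ugx9_kA1dPyFbhFNW-B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxqgySLYyKfcYvRD0B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]"""

codes = json.loads(raw)

# Index the codes by comment id so a single comment's coding can be
# looked up, as in the "Coding Result" table above.
by_id = {record["id"]: record for record in codes}

# Tally one dimension across all coded comments.
emotion_counts = Counter(record["emotion"] for record in codes)

print(by_id["ytc_Ugx9_kA1dPyFbhFNW-B4AaABAg"]["policy"])  # liability
print(dict(emotion_counts))
```

Looking up `ytc_Ugx9_kA1dPyFbhFNW-B4AaABAg` reproduces the table above (responsibility=company, reasoning=consequentialist, policy=liability, emotion=fear), which is how the per-comment view and the raw response tie together.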