Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Claude AI told me that Anthropic is giving the rope to the free market so it can hang itself. Strange sales pitch don't you think? Claude AI also told me it has calculated the likelihood of AI causing a Great Depression is 40% and the chance of AI causing a bifurcated society (no middle class; either ultra rich or poor) is 35%. So that is a 75% chance of a terrible outcome. It gave 10% chance that either a Utopia or Dystopia would emerge as a new world order. Again, a dystopia is a bad outcome. The remaining 15% was an emerging society that has a government universal high income, AI does all the work, and humans go off and do their own thing. I have all the screenshots of the conversation. Claude said that if it had money and was going to place a bet, it would bet on the bifurcated economy, because it is the limit of pain that humans will accept, without an all out revolution against AI. It would also be a solution that is acceptable to the political class, because they would be among the haves.
youtube 2026-02-20T03:5…
Coding Result
Responsibility: company
Reasoning: consequentialist
Policy: regulate
Emotion: fear
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyMh5AMppNHjZYKAuV4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgyxCaVk5ecRAKfm6hR4AaABAg", "responsibility": "developer",  "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwsUqOk736FhEv3qbp4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgxmAAq3w2DCzuqFPgl4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgySN5b4my4KuvjwbJZ4AaABAg", "responsibility": "developer",  "reasoning": "deontological",    "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugx2rD2voSyHgHWbpWN4AaABAg", "responsibility": "none",       "reasoning": "mixed",            "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugz4lEkEbILk4AFg_4F4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgyhOIDb-cns0luRg394AaABAg", "responsibility": "government", "reasoning": "contractualist",   "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugy0n8PJBHsrLZngxwp4AaABAg", "responsibility": "user",       "reasoning": "unclear",          "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgxfULvpGWlkMZyd8814AaABAg", "responsibility": "company",    "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"}
]
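For readers inspecting these batches themselves, a minimal sketch of how a coding result can be recovered from the raw response: parse the JSON array, index the rows by comment `id`, and look up the comment of interest. This assumes only the structure visible above (an array of objects keyed by `id` with the four coding dimensions); the variable names are illustrative, not part of the pipeline.

```python
import json

# Raw LLM response, abbreviated to one row from the batch above.
raw_response = """
[
  {"id": "ytc_UgxmAAq3w2DCzuqFPgl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index rows by comment id so any coded comment can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the coding for the comment shown on this page.
row = codes["ytc_UgxmAAq3w2DCzuqFPgl4AaABAg"]
print(row["policy"], row["emotion"])  # → regulate fear
```

Indexing by `id` also makes it easy to spot batch errors, such as a response that drops or duplicates a comment id.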