Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
When discussing the 'race' to make the 'big final AI' (whatever that is) that the major corporations currently are struggling to do and 'win', and the resulting net effect on society and the value of money and the stock market, etc, etc..... I wish people would remember to consider that in the "winner takes all" scenario being discussed the companies that don't win just wasted TONS of money to not-win that they cant get back. And in that scenario, where the winning company sells the bejesus out of their AI and recoups their investment money, the other companies just take giant loses. Nobody is going to pay a premium for something that is guaranteed to (at best) come in second best on its best day ever. If I'm a company buying, im buying the thing with the highest return, not the thing with the maybe-return or lower-return value. Not unless I plan to lose market share constantly to the company that DID buy the best one. If I did that I'd lose my CEO spot or go out of business. No, I'd only buy the best, because anything else is death, not second place. Piling that economic impact on top of all the market shock of thousands of lost human jobs (assuming that's the result).....is there even a winner here really?
Source: youtube · AI Moral Status · 2026-04-09T21:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
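
The coding schema itself is not published in this view. As a minimal sketch, assuming each record carries the four dimensions shown and only the label values visible in this section (the real codebook may define more labels per dimension), a record could be modeled and validated like this:

    from dataclasses import dataclass

    # Hypothetical label sets, inferred only from the values visible in
    # this section; the actual codebook may be larger.
    RESPONSIBILITY = {"company", "developer", "ai_itself", "distributed", "none", "unclear"}
    REASONING = {"consequentialist", "deontological", "unclear"}
    POLICY = {"none", "unclear"}
    EMOTION = {"fear", "approval", "indifference", "outrage", "mixed"}

    @dataclass
    class CodingResult:
        """One coded comment, mirroring the Dimension/Value table above."""
        id: str
        responsibility: str
        reasoning: str
        policy: str
        emotion: str

        def validate(self) -> None:
            # Flag any label outside the sets observed in this section.
            for field, allowed in (("responsibility", RESPONSIBILITY),
                                   ("reasoning", REASONING),
                                   ("policy", POLICY),
                                   ("emotion", EMOTION)):
                value = getattr(self, field)
                if value not in allowed:
                    raise ValueError(f"unexpected {field} label: {value!r}")

Validating against a closed label set like this catches the common failure mode where the model invents a label outside the codebook.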
Raw LLM Response
[ {"id":"ytc_Ugws3COwC8d4FufEc3p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugy34YRYeQHpzz1z4st4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxMjVlKmhiOmKPAfEJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgyUKgSrcB0-U54ubT54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugw_r_WtMzRQdj1ADap4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwKDrtH7jWDImYhAXB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgyI1QzhU9jg5rveMHV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugy9kHMazm7r56flNm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgziIvpeKp5x_pxTWTt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgxtDFDvZiN_D4zZXMp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"} ]