Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There are two tremendous flaws with these theories. One, in the area of coding, AI produces poor code quickly. This is similar to the reputation of offshoring, the difference being that AI takes less time than an offshore team to produce poor code. In either scenario, skilled workers are needed in equal numbers to review and correct it. Once you account for the time spent re-prompting and making creative fixes on top of refactoring, there will not be much cost savings, because major companies already pay very little for IT support. Two, AI cannot now, and will not anytime soon, handle corner cases. Automation, for example, can copy a document one thousand times if left running; AI adds no value to that task. AI also cannot vet whether its own answers are correct, so the output will require review. The mass firings that are about to happen are being blamed on AI, but they will mostly result from corporations wishing to downsize in the belief that, as the video states, humans are less productive, and from undervaluing the creativity involved in much work. I personally believe the next company to fire more than 10% of its workforce in the belief that Copilot will come to the rescue will also be the first to sell or file for Chapter 7.
youtube Viral AI Reaction 2026-01-19T23:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxEujdwGSYCVUe7RiF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz7soFGPb1BWKa_nyN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwQTOlLNBIYi0ly_at4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx5n9iRCEFEqj3NPjF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwzZyLbohjpiw7Byll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwirjx6o-w58vl4Hm54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxhyuXWxSPytP4uVDR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw-U99stBOK4AtKIgV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzW5J9zNTiDv0wUsSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwwYTLLp73b0dnzhhN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
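A raw response like the one above can be parsed and sanity-checked before its rows are stored as coding results. The sketch below is a minimal example, assuming the response is a JSON array of objects with the five fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the allowed-value sets list only the labels observed in this batch, not necessarily the full codebook, and the function name `parse_codings` is illustrative.

```python
import json

# Labels observed in this batch; the actual codebook may define more
# (assumption -- extend these sets from the real coding scheme).
OBSERVED = {
    "responsibility": {"user", "ai_itself", "none", "developer", "company"},
    "reasoning": {"virtue", "consequentialist", "unclear", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "approval", "indifference", "mixed", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject rows with unexpected labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows

# Example with a single row shaped like the response above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
rows = parse_codings(raw)
print(rows[0]["responsibility"])  # developer
```

Validating against a fixed label set at parse time catches the most common failure mode of structured LLM output: a label the model invented that silently pollutes downstream tallies.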