Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Neil's answer on AI seemed illogical, incoherent, and disorded, unbecoming of a scientist. Seems like the dream for AI and advanced robotics for decades was to decrease the amount of jobs and labor among humans so humans could do other things that work. Now, it's like sense tech billionaires don't plan on sharing any of the spoils from AI models built off humanity's knowledge, we are being told there will be new jobs and new industries that the AI industry will create, just like the auto industry created for displaced workers in the horse and carriage business. It's like what's the point of AI, if we are just gonna generate new jobs for everyone? And what guarantee is there that new jobs can even be created for displaced workers. Because the AI industry is definitely eliminating jobs right now. It's basically faith that new jobs will be created, there is no evidence yet. I just don't get why in an AI age of abundance, we still are clinging to everyone having a job. Seems like either we don't pursue AI, and keep the jobs we have right now, or we pursue AI to release humans from having to do jobs. But not pursue AI and make up BS jobs new (which likely could be done by AI) just because the rich want to hoard all the wealth from AI.
YouTube · AI Moral Status · 2025-07-28T21:1…
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzDtimEa5ym9QXPfWl4AaABAg", "responsibility": "none",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgwhSfpqcuO7xldq7mR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgypLbbZdjy5qOg8LVh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgzL7bH3nBqLznQLh2p4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgxzkJWH7vnyd6mRG9x4AaABAg", "responsibility": "none",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgyFTw5AjUUsL66KQEh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxUTfy1lbt1wk3WWKt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgxuFW0zW4M6DsXAqox4AaABAg", "responsibility": "none",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugy4bc09jDzu5Nc5OyJ4AaABAg", "responsibility": "none",    "reasoning": "deontological",    "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_UgzjX25EgPyXSMo0lKl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"}
]
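A batch response like the one above can be checked against the Coding Result with a short sketch. This is a minimal illustration, not the tool's actual pipeline: it validates that each record carries the four coding dimensions plus an id, then looks up the one record whose codes match the displayed table (the record id used here is taken from the response itself; which id belongs to the displayed comment is an assumption, since only one record in the batch has policy "liability").

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten records).
raw = '''[
 {"id":"ytc_UgzDtimEa5ym9QXPfWl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyFTw5AjUUsL66KQEh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

# Every record must provide an id and all four coding dimensions.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)
for rec in records:
    missing = REQUIRED - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id')!r} is missing {missing}")

# Index by comment id, then pull the record presumed to match the
# displayed comment (the only one coded policy=liability in this batch).
by_id = {rec["id"]: rec for rec in records}
code = by_id["ytc_UgyFTw5AjUUsL66KQEh4AaABAg"]

# Matches the Coding Result table: company / consequentialist / liability / outrage
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
```

Looking up by id rather than by position matters here: the model returns records for the whole comment batch, and nothing guarantees their order matches the order in which comments were submitted.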