Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI is a technology that only supports the reduction of jobs, and the removal of survivability for the human race. It promotes the ability to pay lazy and greedy people, instead of the hardworkers deserving of survival. You should not be allowed to buy a tool that lets you AVOID contributing or colloborating with others and then get paid for it. This takes finances away from hardworking families in an overpopulated world that need to eat and survive. It creates more crime and poverty. People start justifying not learning in schools, because they don't have to. AI is not a good idea when you have an excess of human beings to do those tasks. Start hiring the people qualified to do roles...then society will begin to function again. Stop letting people out of their responsibilities to their employees and their communities. What happens when nobody can afford to pay for your AI tools? When they fail, and there's nobody there to correct for them? Because nobody is trained in how to do it or has experience because they've never been employed and started on basic tasks? By pushing this technology you are pushing us down a terrible dark road...be careful.
Source: youtube · AI Responsibility · 2024-01-26T14:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzSU3yf8WbyMkULY5V4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxuc1K6yvX_PdHY9Fd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwkz9uVXn1pR1b17zV4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx3dqP2XSLz9rmO3314AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgynYKNt2FexsnDDcb94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugzimf8YXA6FdtDNlBd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy1VOIYYhsTxVaTPo94AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyha-sRfkXDD7oG9A94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgydKcoO1WDXsuQrY6R4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwgO7SDFnfb-0-JaK54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
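A minimal sketch of how the coding shown above could be recovered from the raw batch response: parse the JSON array, index the rows by comment id, and look up the comment in question. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself; the variable names are illustrative, not the tool's actual code.

```python
import json

# Abbreviated raw LLM batch response (two of the ten rows shown above).
raw_response = '''[
  {"id": "ytc_Ugzimf8YXA6FdtDNlBd4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwgO7SDFnfb-0-JaK54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]'''

# Index codings by comment id so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# The comment quoted above resolves to the coding shown in the table.
coding = codings["ytc_Ugzimf8YXA6FdtDNlBd4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company outrage
```

Indexing by `id` is what lets a UI like this one pair each comment with its row in the batch response, even when the LLM returns the rows in a different order than the comments were submitted.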