Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I remember of the lessons of the industrial revolution and how Mill workers were displaced. At the end of the day, human imagination was alive and dreaming of a better way to survive. Machines are tools to be used, not to use their operators. You spoke of what a human needs and why it's needed. I can see practical implications in some areas, while pitfalls to a greater humanity as well. Furthermore, there's something rewarding about creating with your own mind and hands. It may not be perfect, but it's our human fingerprint and not someone else's. Too often now, able bodied humans will spend hours on end interacting with machines not communicating with their neighbors or working together. I see advertising targeting young adults, saying you do not need to think on your own and AI will do that for you. My question is what happens when devices are damaged or down, can you still create and do the job? Or you do nothing? Another item, the current state of democracy and government leaders is dysfunctional, because we treated technology as the answer for everything. People are less informed and engaged now in what happens in their own backyard today, playing online video games intentionally designed. I like technology as much as the next person, but AI is just a tool to be used and put away when the job is done. People now are addicted it and even have AI partners. Not everything you saw in a science fiction program should become reality. The implications for humanity are real as we move forward with this technology.
youtube AI Responsibility 2026-01-09T17:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwwZCujli0B4x5fCu94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyEyJWCSHmwD-Kru7x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwq4PY5ONLga4H7Xx54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwO8h5bOu6ffXdPUJR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxI9Ykwky3K6ime3414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwdAZPfj9L0KPk627N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyT6j-L34xgw8eAz254AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzNC4n1RVH0WgiLM5l4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwtJeSrM9jLu_jRnG94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzk49Fppyiw3f2dIO54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
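A minimal sketch of how a raw response like the one above can be parsed back into per-comment codes. The variable names (`raw`, `codes`, `entry`) are illustrative, not part of any pipeline shown here, and the array is abbreviated to two of the ten entries; the lookup uses the id of the comment displayed above, whose codes match the Coding Result table.

```python
import json

# Raw model output: a JSON array with one object per coded comment.
# Abbreviated here to two entries from the response above.
raw = (
    '[{"id":"ytc_UgwdAZPfj9L0KPk627N4AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"none","emotion":"resignation"},'
    '{"id":"ytc_UgxI9Ykwky3K6ime3414AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)

# Index the coded rows by comment id so any single comment's codes
# can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw)}

# The comment shown above carries id ytc_UgwdAZPfj9L0KPk627N4AaABAg.
entry = codes["ytc_UgwdAZPfj9L0KPk627N4AaABAg"]
print(entry["emotion"])  # resignation
```

Keying the rows by `id` is what lets a viewer like this one pair each comment with its exact codes even though the model returns all ten in one batch.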