Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Narrow AI seems like a great idea. Assisting human function is the point. We do not want something to replace us in every facet of life. You want one human to ensure the function of one narrow AI. For example.... someone automates growing food in a greenhouse.... all the watering and feeding and harvesting and manual labor is done by AI and robots.... all the overseeing of the narrow AI and a human checking to ensure everything is being grown well would be all that is necessary. Basic income will exist... and be tied to market rates and income levels and inflation. Do not worry about the money. Are you worried about what you will do with all your time? If you have ever worked in your life... you will have thought long and hard about what to do in your off time. Well... meaning is not just a job. Meaning is a result of cultivating yourself over time.... you find meaning through human experiences over time. We all need something to do, someone to love, and something to look forward to. Most humans may struggle with this. I would choose to observe humanity and write about human social theory because that is fun for me. I would find my life friend and maybe have babies... maybe not. I would look forward to having excellent social ties and new enjoyable human experiences. I would look forward to exploring the universe. Narrow AI and human supervision are the winning combo.
youtube AI Governance 2025-09-04T13:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          industry_self
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyYS50ESWywGe28GHt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzD_tDf_xphAn-OiPt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwG-XvCX1XI8yI5oq54AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwJzMxKwgdo8mu-F-V4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzQkyHgJMMPo41elhh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwx7DN34a9BczADmlZ4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz4mUuf9bcL6_OS-uN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxlykFN92CilMxfBtp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugymp2RCxgmqK39Dvy94AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx2PQqHh0cjWBAsRoJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
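The raw response is a JSON array with one object per coded comment, keyed by comment id, with the four coding dimensions (responsibility, reasoning, policy, emotion) as fields. A minimal sketch of how such a response could be parsed and looked up by id — this is an illustrative snippet, not the pipeline's actual code; only one entry from the array above is included here for brevity:

```python
import json

# One entry from the raw LLM response shown above (the record
# matching the Coding Result table for this comment).
raw = (
    '[{"id": "ytc_UgxlykFN92CilMxfBtp4AaABAg", '
    '"responsibility": "developer", "reasoning": "deontological", '
    '"policy": "industry_self", "emotion": "approval"}]'
)

# Parse the model output and index the coded dimensions by comment id.
codes = {entry["id"]: entry for entry in json.loads(raw)}

record = codes["ytc_UgxlykFN92CilMxfBtp4AaABAg"]
print(record["responsibility"], record["reasoning"])  # developer deontological
```

Indexing by id makes it straightforward to join each coded record back to its original comment text for inspection, as this page does.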