Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm an electrician and gas man, doing installations at people's houses, and every single one is unique, in a different place, with different challenges. It requires manual skills that come with years of experience. The only way I see losing my job is those humanoid robots becoming as manually skilled as humans. So far I don't see them matching the dexterity of humans for the next few decades at least. Yes, they can be programmed to do repetitive things, so building cars will be fully automated. Maybe any warehouse will be automated. Cars can be self-driving, but Elon Musk promised we would have that 5 years ago and it's still not happening. I don't believe building houses will be fully automated; again, there's too much going on with building a house: all the different materials, laying bricks, plumbing, wiring, etc. The only houses that can maybe be fully automated are those made in a warehouse, but you still need a crane to raise them at the building site. I can't see, in the near future, a bunch of robots walking around the building site, one directing another on what to do next or how to swing a crane 😅. I mean, you have Alexa and Google and you still can't have conversations with them like with another human. You need to be precise about what you ask of them and download extra skills. And how often do they say "I'm not sure what you are asking" 🤔
youtube AI Governance 2025-10-19T13:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyvRZ_t_X_J6_ynsy14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyUfB6qmWmQginddKt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx9ijNL1iEpS8vw4qh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxv7YnT7d6VGgkrW6V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgznLfZPzhu_003ga7Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzA69EoLDtGWInH2qJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyBT3I615nDQfkvW0N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzD_kJ2H0GR7zlGpxh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzYIxvH2IXzWV3wBb54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzZOIqO_EGUMIAfPzF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
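The raw response is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such a response could be parsed and validated before use — assuming Python, and with allowed-value sets inferred only from the codes visible in this dump (not the project's authoritative codebook):

```python
import json

# Allowed values per dimension, inferred from the codes shown above
# (illustrative assumption, not the real codebook).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"approval", "outrage", "indifference", "resignation", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    fall inside the allowed sets for every coding dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: one record from the dump above.
raw = ('[{"id":"ytc_Ugx9ijNL1iEpS8vw4qh4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # → indifference
```

Dropping invalid records (rather than raising) mirrors the tolerant behavior an annotation pipeline typically needs when an LLM occasionally emits an off-codebook label.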