Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The people who think that a real on-site renovation/repair plumber, carpenter, mason or electrician will be fully automated any time soon should go spend a month working in these types of environments. Our fine motor skills and capacity to improvise solutions to quickly solve issues on the job won't be easy to replicate. Now when it comes to assembling new buildings, it's a different story because we'll probably move industries towards more modularity to make the robots' job easier. But going from shiny demos to full scale industry deployment will take a lot of time. Maintaining expensive robots might not make economical sense in the beginning, initial investment costs will be high and even if it starts to make sense in some sectors, you'll still have to overcome regulations and political hostility. On a long enough timescale anything can happen if we build some form of artificial superintelligence. And that's indeed scary. Finally, I don't think superintelligent AI engineers will choose the humanoid form for their fleets. The Matrix robots comes to mind.
Source: youtube · AI Governance · 2025-09-04T23:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyBgA8eg3QfPQNdxhV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJ3_vaZfStjyuTxcV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyr9syJi8XZ2nLbYtV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugymqa81kwy19MFsOUF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyazVQDL2om0VrIXSV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyMgOApBDt7-1X3BXR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtYknA8AwPvtAM-DJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwxOqyP1K_RWjWc6w94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw-9_jFvvNfkuWbH854AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzDw4WNNK-6v5pu8Wh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
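The raw response above is a JSON array with one object per coded comment, each carrying an `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and a single comment's coding looked up by id — the lookup helper is not part of the tool, just an illustration of the data shape:

```python
import json

# Raw LLM response: a JSON array of per-comment codings, shape as shown above
# (truncated here to two entries for brevity).
raw = '''[
  {"id":"ytc_UgyBgA8eg3QfPQNdxhV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJ3_vaZfStjyuTxcV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Inspect the coding assigned to one specific comment.
coding = codings["ytc_UgyBgA8eg3QfPQNdxhV4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → ai_itself indifference
```

Because each coding keeps its source `id`, the displayed Dimension/Value table for any comment can be cross-checked against the exact model output this way.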