Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think it's sad that humans would not write code. I built electronics and wrote a program by tapping wires for ones and zeros. Then I used assembly. Then a high level language. Then my own language. I like to think I could do it again. Sometimes I look at the assembly. I like hand made stuff, all done by one or a few humans. I liked to grow my own food. It's called resilience, that's being a good human. Even tech guys have a word for it: "vertical integration". They think it is good for their companies. However, they want to destroy our integrity, our resilience, they want to make us lazy and dependent so that they can profit. With luck, we may use AI to make us humans better, and design a system in which everyone becomes more equal and has access to more resources. I would have no worries about AI if it and the profits it generates were in the hands of people, not companies.
youtube 2026-02-08T13:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        virtue
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugyto9ptKRbtLb3ilit4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwcj3c6_jV5J1AQvWt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz6okYPmU_Y-mrj5q94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyeQo42mVMzXrGpXaB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwBhZsnzAJ8vVrYq8J4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyiWEKdr3i43xgQL4F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy9TixES5hDJvrqDPR4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxC5R-43fUn2nYhDoJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgzkHLrkFKHNXtDqjbx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwdq_Dnimco9WTi8Z14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
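The response above is a batch: the model codes several comments in one call and returns a JSON array with one object per comment id. A minimal sketch of how such a batch can be parsed and a single comment's coding looked up by id (the two-element `raw` string here is a shortened stand-in for the full response; ids and values are copied from it):

```python
import json

# Shortened stand-in for the raw batch response shown above:
# a JSON array, one object per coded comment.
raw = """[
  {"id": "ytc_Ugy9TixES5hDJvrqDPR4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwcj3c6_jV5J1AQvWt4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

# Index the codings by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding for the comment displayed on this page.
result = codings["ytc_Ugy9TixES5hDJvrqDPR4AaABAg"]
print(result["reasoning"], result["emotion"])  # virtue resignation
```

Indexing by id rather than scanning the list each time matters when the same batch is queried for many comments, as a viewer page like this one would.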