Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This kind of thinking is not "planned well" in the Western market. As with any technological gain, there is not enough incentive or drive to "skill up" workers. While there ARE systems to skill up, the issue is jobs. Some technological advances generate new jobs, but many of these newer jobs need fewer people. Take the system administrator as an example. In the past you needed a local team to handle locally installed servers; now you need one or two to maintain cloud-based applications, and most of the time those run on their own. The people who used to be system administrators change or skill up to become system engineers or coders for cloud-based applications, but then AI comes in, builds a basic version "decently," and needs only one or two people to cross-check it. Now insert this idea into other working environments. The car industry is a good example: you used to need a huge team to build cars; now robots are in place and a smaller team maintains the robots. You used to need a team of sellers; now people can buy online without meeting a person, and you just need a finance person (if that). What if the whole process is AI-driven and you can buy without needing to meet a person at all? What happens to all those car salespeople? What are they going to do? (Again, insert any other job into this scenario.) Couple this with the idea of "just good enough": people have accepted a lot of things as just good enough and moved on (like AI customer service).
youtube AI Harm Incident 2024-07-29T15:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugye5s_c6WHA6XWmQ_R4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwc7eoF6Sf0d391T5V4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyYBzXhV88xKYxA_8N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw-LBpTOsYr_VoZ6JF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyp96x2ezuFFPGJFx14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxhS3yHaXa0hkHZcf94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwrQ9--SzvqI1s7BEZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz0iEE1-reCMurxKGF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzfaMyoOMLgZLZ-ij94AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwQS0DYbJIfLm89wfd4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "liability", "emotion": "fear"}
]
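The raw response is a JSON array with one object per comment, each carrying the same four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and tallied in Python (the two entries are copied from the array above; nothing here is part of the actual pipeline):

```python
import json
from collections import Counter

# Two entries copied from the raw LLM response above; the full
# array parses the same way.
raw = '''[
  {"id": "ytc_Ugz0iEE1-reCMurxKGF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzfaMyoOMLgZLZ-ij94AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

codes = json.loads(raw)

# Tally the label distribution for each coding dimension across comments.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(c[dim] for c in codes) for dim in dimensions}

print(tallies["responsibility"])  # Counter({'company': 1, 'government': 1})
```

Looking up a single comment's codes is then just a matter of indexing by `id`, e.g. `{c["id"]: c for c in codes}`.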