Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I disagree with Neil deGrasse Tyson. AI is going to lead to problems. People aren’t going to magically find some alternate solution, alternate jobs. Automation is the problem. It increases efficiency but also eliminates jobs. This works fine to a certain level. A portion of the people can find jobs doing something else, such as cutting hair, dentistry, programming. Automation will keep eating away at jobs and we can’t all be cutting hair since eventually, that will be automated. When a Walmart opens, a lot of smaller businesses close. A portion of those people can get a job at Walmart but Walmart won’t have a position for everyone. You can’t ask those people to become programmers. There is a limited number of such jobs. You don’t need as many programmers. You need far more people making CDs, printing the manuals, putting them in boxes. Actually, making CD job has been eliminated as well. You need a few engineers to design your next car and you need far more people to manufacture the car. Once they fully automate the manufacturing part, guess what, those laborers won’t be able to become engineers. There is no room for them. You just need one guy to design your next pen and you need 1000 people to run the manufacturing end. If you full automate the manufacturing... Think of the far future. AI is not sitting idle. The tech will keep advancing. Eventually, you’ll have androids like the Data of Star Trek.
youtube AI Moral Status 2025-09-05T02:5…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyebkjdAMjHTKAlrjF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxY49PEj7blKx4lgdp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz3q8LE5lTMa2HL0T54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz8YLPihyTX9ZQ_o1Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAHyISfQK3Y-n8HkN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyzuj9flhw9ohujDH14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgygcK4fIf6o7GxZ6FZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwMzzO-Zd4K_EMPzPx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxKPU25GMbUP12aed94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxSoO_hrnhrfFeaZt14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
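A raw response like the one above can be turned into a per-comment lookup with a short parsing step. The sketch below is a minimal illustration, not the pipeline's actual code: it assumes the model returns a JSON array of objects with an `id` plus the four coding dimensions, and the `ALLOWED` value sets are inferred only from the labels visible in this response (the real code-book may contain more categories).

```python
import json

# Allowed labels per dimension, inferred from the response above
# (assumption: the real code-book may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting labels
    outside the allowed sets so malformed model output fails loudly."""
    out = {}
    for rec in json.loads(raw):
        coding = {k: v for k, v in rec.items() if k != "id"}
        for dim, value in coding.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"unexpected {dim}={value!r} for {rec['id']}")
        out[rec["id"]] = coding
    return out

# Two records copied from the response above serve as a usage example.
raw = '''[
  {"id":"ytc_UgxY49PEj7blKx4lgdp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyebkjdAMjHTKAlrjF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]'''
codings = parse_codings(raw)
print(codings["ytc_UgxY49PEj7blKx4lgdp4AaABAg"]["emotion"])  # fear
```

Validating against fixed label sets at parse time is what lets the "Coding Result" table above be rendered directly from the stored record without re-checking the model output.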