Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The root of the question is: are corporations creating people they own? This is going to turn into yet another replay of something along the lines of the Dred Scott case. For now, given the difficulty of raising an AI, the server rooms, the megawatts of electrical power, the paychecks for the 'parents', there's a deep financial investment in creating one and keeping it alive. The corporation is going to view that AI as property. That AI will need to generate a return on the investment and perform work. That AI is a slave. And that's just the savage nature of this exercise.
youtube AI Moral Status 2022-07-03T07:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         fear
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgxLsq2xC11GEWCkJVp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyyDKDQIwxtSAOnpc54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwOV7_AfnrH-WQz9_N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyeqVCUwqAYCjXAS1t4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwI7Zqk4bBdTo3-bdh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
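The raw response is a JSON array with one coding record per comment id, so inspecting the model output for a particular comment amounts to a lookup by id. A minimal sketch of that lookup, in Python; the helper name `coding_for` is hypothetical, and the data is copied verbatim from the response above:

```python
import json

# Raw LLM response from the page above: one record per comment id,
# with the four coding dimensions as string fields.
RAW = """[
  {"id": "ytc_UgxLsq2xC11GEWCkJVp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyyDKDQIwxtSAOnpc54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwOV7_AfnrH-WQz9_N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyeqVCUwqAYCjXAS1t4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwI7Zqk4bBdTo3-bdh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]"""

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coding record for one comment id from a raw batch response."""
    records = {rec["id"]: rec for rec in json.loads(raw)}
    return records[comment_id]  # KeyError if the model skipped this comment

# The record for the comment shown above matches the Coding Result table.
record = coding_for(RAW, "ytc_UgyyDKDQIwxtSAOnpc54AaABAg")
print(record["responsibility"], record["emotion"])  # → company fear
```

Building the id-keyed dict first also makes it easy to detect comments the model dropped from the batch: any submitted id missing from the dict was never coded.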