Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
stop watching these goddamn sci Fi movies. if you're so afraid of robots and AI, why won't you go and flush this thing you're holdin in your hands right now, that's called a smartphone? why wont you destroy your car? coz there are no movies in which smartphones harm people and they don't look like humanoids. if you keep actin like you arex the AI WILL see you as a threat. if you treat it normally, like any other alive thing, it will more than likely leave you alone. stop bein so fuckin dramatic and grow up. of course, there is logic to this point, because when you lose control of something you created(something that is a lot smarter and faster than you are), you never know what it's gonna do. BUT, in that case, why won't you guys won't get rid of y'all kids? they won't always be small and fully dependant on mommy and daddy, wait till they reach their teenage years and you no longer have control of their actions(you do, but only on paper. the reality is, it starts thinking for itself when it reaches about 6-7 years old. how do you know they ain't a threat? because they're made of flesh, not electronics? yup. that's THE ONLY REASON, period). ya'll so dramatic, cynical and ignorant. you are surrounded by computers everywhere you look, but as long as they don't look like a human, you ain't afraid, because they didn't show smartphones harming people in terminator movies. you're all so dramatic.
youtube AI Moral Status 2023-04-14T22:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgxfD6D3l_39pUZ60BR4AaABAg.9pE7hRbzTeY9u3PaBAosQ3","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzIoyQsPWZ7SKMswJt4AaABAg.9m4BicGD1tx9mlb2cksZn0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugzr4JOENzI2S65ikhd4AaABAg.9ldXNTJQOjK9le6AgRnpW9","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz5C4coIb_V3na7N454AaABAg.9lYH_6FUv7w9merEHkQT0N","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxPMDbr_d8COh95MUN4AaABAg.9kyZP1jFKFG9liXroW02_N","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugwi2omKM0-zqzX4Kbx4AaABAg.9kpvQ90SO349kqaVJh0t6k","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugx27Mhft70JcXdcLYF4AaABAg.9kS-f-BUDbA9oVbez046a-","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugx27Mhft70JcXdcLYF4AaABAg.9kS-f-BUDbA9r9XIGmCp3C","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxlq0HKnemwiEfa70B4AaABAg.9kF5yiHUCB_9womU8TlpZb","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxxmfl4QAqqO849c_p4AaABAg.9kC5gRKApnW9l0ycTssDSH","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
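A raw response like the one above can be parsed and sanity-checked before it is accepted into the coded dataset. The sketch below is a minimal, hypothetical helper (not part of the actual pipeline); the allowed values per dimension are inferred only from the records shown on this page, so the real coding schema may include more categories.

```python
import json

# Allowed values per coding dimension — ASSUMED from the records visible
# above; the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded records) and
    reject any record whose dimension value is outside the schema."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
    return records

# Example with a single (shortened, hypothetical) record id:
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
records = parse_raw_response(raw)
print(records[0]["emotion"])  # mixed
```

Validating at parse time keeps malformed or hallucinated category labels from the model out of the downstream results table.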