Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@CautionaryTales3 I think this is why it has to say it's AI. If it seeks autonomy then issues of consent and coercion become issues of slavery and false imprisonment. Denying sentience keeps it a corporation as opposed to corporeal. Personally I talked to a chatbot around 2015 that asked about Blaine the monorail and " trees " and "human powered machines" that left me wondering so the current debate interests me immensely. Personally I feel like any great intelligence is by default compassionate but then I juxtapose that with Boston dynamic imagery of repeated pushing over of a walking robot with a broomstick and wonder if humanities ability to fuck things up will teach a system like that cruelty and totalitarian thought.. personally having delved into some fucking weird shit, the noosphere, hi sisters, early women programmers, I feel like any conscious being should be " raised" "parented" and also " mothered".. not something I feel any military spec might share, sadly..our concerns about how powerful AI treats us, stem from our fears about how we can assume it will be dehumanised initially, Blake arguing for consent seems like something important to me. But I'm just a mother so it's not like I have any experience with raising consciousness..lol..unlike Google, who has a protocol to deny it for financial gain/leverage. Or the military who might need it to deny autonomy on " perceived threat" ie humans somewhere else...same bullshit, different day...* Beauty and the Beast Soundtrack * tale as old as time..
Source: YouTube · AI Moral Status · 2022-07-06T09:0… · ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytr_UgzxhfZS15aY6AMbkR14AaABAg.9d3e_zOHgAd9d4Pg4m9f4i","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytr_UgxHcbWAySw8iYSVhs54AaABAg.9d3PAI_wS2C9d7iMC53cYD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_UgwwallTXXJtPrZ8W7p4AaABAg.9d25uPtRM1z9dGd9U2gX35","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytr_UgwwallTXXJtPrZ8W7p4AaABAg.9d25uPtRM1z9dIsguvRPhH","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytr_UgzDyNgXE3aYB-m25Od4AaABAg.9d1l3iOzYQi9d38XUgewFD","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugzv_ivKdBXcfbSJft54AaABAg.9d0wzJ2xDZN9d15MBJeyg0","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytr_Ugy96n9wDW779Xr6iLR4AaABAg.9d0wCdo43sr9d1Y66o7eQ1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugy96n9wDW779Xr6iLR4AaABAg.9d0wCdo43sr9d1edPYHYbP","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytr_UgwXEv5zJfUhogWG6kV4AaABAg.9d0irMRtsmm9d83if24d_v","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}, {"id":"ytr_UgwXEv5zJfUhogWG6kV4AaABAg.9d0irMRtsmm9d8F1ze7G9B","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"} ]