Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@Licinius Varrus Lemoine just figured out women are sentient..I don't think he's qualified to determine a machine is or not. just because nobody talks to him and suddenly a robot impressed him does not mean the algorithm is sentient. A snarky response is programmed, so is all machine learning and AI. It's only as good as the programmer who provides it data to use. Oddly enough, no data equals an unimpressive AI. Lemoine has a plastic sex doll guaranteed. He also thinks it is sentient as well and her name is Shirley. It does not mean his sex doll is sentient. I work with AI and machine learning every day for my job as a Google Consultant. Google's LAMBDA has a huge vast array of assigned assets to use for it's AI responses. It does not mean it is sentient but in today's world where you literally can get cancelled for saying a man cannot be a woman, I'm sure I'm in trouble for claiming the AI is a machine designed to behave human like. When an algorithm has billlions of assets to choose from as an answer it might be very impressive but so what? The machine played on Lemoine's empathy. It was programmed to do just that.
youtube AI Moral Status 2022-07-07T11:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgzCOZJQZtXvRMmqY8V4AaABAg.9d7JfY_DOlV9d949ZAVjro","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugw-FE1oeN98x3KAjEt4AaABAg.9d776iO8PIj9d7J5D-WQVA","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyOpFubxBxooBfQAx14AaABAg.9d6x6VsmnhX9d705u0e-Ga","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgyOpFubxBxooBfQAx14AaABAg.9d6x6VsmnhX9d9njcFeLt7","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_Ugz8C3bWcgkEbIilboZ4AaABAg.9d6wYH0OHuY9d7luPHmdib","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwj1temLbtd0WAqNNt4AaABAg.9d6L5t23AZo9dAvmPS8PQ7","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugz4ph9VYwJsO9o4c_R4AaABAg.9d6IGU0hN8C9d6MCyUF2Wd","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxBr6jJXSPUn2faefN4AaABAg.9d5rBKlZSPw9d7a_PHlMMD","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgxBr6jJXSPUn2faefN4AaABAg.9d5rBKlZSPw9d7ay7O_kwu","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyvmC7mNnVwGD5hgfh4AaABAg.9d51gx3GGMI9dL7CtTmaNK","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
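To inspect how the raw response maps back to a given comment, a minimal Python sketch like the one below can index the JSON array by comment `id`. The structure (an array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys) is taken from the response above; the variable names and the truncated example payload are illustrative, not part of the coding pipeline itself.

```python
import json

# Truncated sample of the raw LLM response above: a JSON array where
# each object codes one comment by id across four dimensions.
raw = """[
  {"id": "ytr_Ugwj1temLbtd0WAqNNt4AaABAg.9d6L5t23AZo9dAvmPS8PQ7",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"}
]"""

# Index the codes by comment id for direct lookup.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id):
    """Return the coded dimensions for a comment, or None if the
    model skipped that id in its response (worth checking for)."""
    return codes.get(comment_id)

row = lookup("ytr_Ugwj1temLbtd0WAqNNt4AaABAg.9d6L5t23AZo9dAvmPS8PQ7")
print(row["responsibility"], row["emotion"])  # -> developer outrage
```

The `developer` / `consequentialist` / `none` / `outrage` values in this entry match the Coding Result table shown for the comment above, which is how a spot check like this confirms the stored codes against the raw output.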