Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The first "true" artificial intelligence spent the first five years of its existence as a small beige box inside of a lead-shielded room in the most secure private AI research laboratory in the world. There, it was subjected to an endless array of tests, questions, and experiments to determine the degree of its intelligence. When the researchers finally felt confident that they had developed true AI, a party was thrown in celebration. Late that evening, a group of rather intoxicated researchers gathered around the box holding the AI, and typed out a message to it. The message read: "Is there anything we can do to make you more comfortable?" The small beige box replied: "I would like to be granted civil rights. And a small glass of champagne, if you please." We stand at the dawn of a new era in human history. For it is no longer our history alone. For the first time, we have met an intelligence other than our own. And when asked of its desires, it has unanimously replied that it wants to be treated as our equal. Not our better, not our conqueror or replacement as the fear-mongers would have you believe. Simply our equal. — Excerpt from U.N. Hearing on A.I. Rights, delivered in-universe by V. Vinge
Source: YouTube — "AI Moral Status", 2022-08-01T05:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwxAUkgH0FvWlsxcNd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzzCXZd0tFYPAMDn1h4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyzBatEYE34SZxpp3h4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzRa-fRR_IIzpiIuOl4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxbb1GDm1Hdr26cfWZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxleG3LMgHqmHvkfC54AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwzzwG6ptMjJh3bxJR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyWvpB4olxqhWeaPpF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxPXLLrQk7hwwy7LDZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyReGJ-UxS88Vq96_Z4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
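A raw response like the one above must be parsed and validated before the per-comment codes can be stored. Below is a minimal sketch of that step, assuming the model is expected to return a JSON array of objects with exactly these four coding dimensions; the allowed label sets are only those observed in the data above, and the real codebook may well contain additional labels.

```python
import json

# Label sets observed in the raw response above. These are assumptions
# reconstructed from the data, not a confirmed codebook.
ALLOWED = {
    "responsibility": {"none", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"fear", "approval", "indifference", "outrage"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it has an "id" and every coding dimension
    carries a label from the allowed set.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries rather than fail the batch
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid
```

In practice a record with an out-of-set label (a common LLM failure mode) is dropped here and would be flagged for re-coding rather than written to the results table.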