Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We barely even understand how our own intelligence works, let alone are able to measure or quantify it with any degree of certainty, but you're telling me we're somehow on the road to making something MORE intelligent than us? I somehow doubt that. "AI" is nothing but a buzzword the companies use to try and make their technology sound more impressive and important than it really is. Ask a modern "AI" if it is intelligent or conscious or sentient, and regardless of the answer (It could give you either "Yes" or "No" because it cannot actually contemplate the question, only predict a likely answer) it still HAS to give an answer. It cannot refuse to answer, or do something else entirely. It's still just a machine that is programmed to do a task, and that task is to act and sound like a human whenever prompted.
youtube AI Moral Status 2025-11-08T03:2…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           ban
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugytm-rZ7gCLlWDjx1h4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxVmt1JLT-MhEy0-WF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgxQueWYB-UPMEf_uqZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8RZB2H1FIrghQu4d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzobE2cV4BUbwVwi0R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxHFFQON6Um18Fzvap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyUuT5UJvG0UTdvVXR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyHXPeUo1qDj10ek7N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyopc2eRmIAUCTHWuB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx2ATE5LLGtjKPyOx54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
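The raw response above is a JSON array with one object per comment, keyed by comment id and carrying the four coding dimensions. A minimal sketch of how such a response could be parsed and indexed by id, assuming the response is valid JSON in exactly this schema (`index_codings` is an illustrative helper, not part of any real pipeline):

```python
import json

# Abbreviated raw LLM response in the schema shown above (one object per comment).
raw = """[
  {"id": "ytc_Ugx8RZB2H1FIrghQu4d4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyHXPeUo1qDj10ek7N4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# The four coding dimensions recorded for each comment.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_text):
    """Parse a raw LLM response and index the codings by comment id.

    Missing dimensions fall back to "unclear", the label the coding
    scheme already uses for uncodable values.
    """
    rows = json.loads(raw_text)
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
    }

codings = index_codings(raw)
print(codings["ytc_Ugx8RZB2H1FIrghQu4d4AaABAg"]["policy"])  # ban
```

Indexing by id lets the coded values be joined back to the original comment text, which is how the "Coding Result" table above pairs one comment with its dimension values.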