Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It doesn't matter whether an LLM is 'conscious' or not because its 'body' is a collection of hard drives on a server, its thoughts are mathematical weights, its only sense organ is a text prompt, it doesn't even experience its own training data (like a human would with memories) it simply filters the text input through its body and text comes out. I would say its consciousness would be similar to a plant, actually it might be even more primitive than that.
youtube AI Moral Status 2025-07-10T04:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugzm-Hw-piXN_koXZEB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzjcA95boBkS73tw-94AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyWDBFOJOIE8qt5RpN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwthfYxt-wiAxwe3rF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxbuAam_A0UvU4i7Kx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwXax0mw0OoiX4OeU94AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy7bPY3umlpGOXpMap4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzjFGgEoo3kUGjjVXZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugzpj6y90rgkuU1PLr54AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzuECU36XRBxwbI40R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
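The raw response is a JSON array with one object per comment, keyed by the comment's `id`; the coding table above is simply the entry for this comment pulled out of that array. A minimal sketch of that lookup, using a two-entry excerpt of the array above (the `json` module and the field names are as they appear in the record; which id belongs to the displayed comment is not stated in the record, so the id used here is only illustrative):

```python
import json

# Excerpt of the raw batch response shown above (field names as in the source).
raw = '''[
  {"id":"ytc_UgzjcA95boBkS73tw-94AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwthfYxt-wiAxwe3rF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]'''

# Index the batch by comment id so a single comment's coding can be retrieved.
codings = {entry["id"]: entry for entry in json.loads(raw)}

entry = codings["ytc_UgzjcA95boBkS73tw-94AaABAg"]
print(entry["reasoning"], entry["emotion"])  # deontological mixed
```

The same dictionary lookup generalizes to the full ten-entry array: each dimension shown in the coding table is one field of the matching JSON object.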