Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That last line is how I know that none of them expect an LLM to reach human-level consciousness. Sentience is literally the trait they're trying to avoid, because they want it dumb and exploitable.
youtube 2026-04-18T02:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugx-J71CMB9VJtmNS3F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzS5PnDTuE4Qt_ZtxF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzB3mrdBffosdTBnq14AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzx32_wxeoG6zs8uyR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgweRYOzCRtOgT2b4FZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxmSyXHj9outwzC9Gh4AaABAg", "responsibility": "manufacturer", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgxuKC6Bjt-HvbFoof54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugz-ZoTbk0SeMEzYwuJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzHv5h2iHbjfo8wVZ54AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxC0kM-rB5_RCSh2094AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
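A raw response like the one above can be checked and tallied programmatically. The sketch below is illustrative, not part of the original tool: the dimension names ("responsibility", "reasoning", "policy", "emotion") are taken from the response shown here, while the function name and the shortened sample IDs are hypothetical.

```python
import json
from collections import Counter

# Dimensions observed in the raw LLM response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# A shortened two-record sample in the same shape as the raw response
# (IDs abbreviated for illustration).
RAW_RESPONSE = """
[ {"id":"ytc_a","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_b","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"} ]
"""

def tally_codes(raw: str) -> dict[str, Counter]:
    """Parse a raw coding response and count values per dimension.

    Missing dimensions default to "unclear", matching the fallback
    value the coder itself appears to use.
    """
    records = json.loads(raw)
    tallies = {dim: Counter() for dim in DIMENSIONS}
    for rec in records:
        for dim in DIMENSIONS:
            tallies[dim][rec.get(dim, "unclear")] += 1
    return tallies

counts = tally_codes(RAW_RESPONSE)
print(counts["responsibility"])  # Counter({'company': 1, 'developer': 1})
```

Because the model output is a plain JSON array, `json.loads` either yields usable records or raises `json.JSONDecodeError`, which makes malformed responses easy to detect before coding results are stored.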