Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Understanding how LLM’s work isn’t very useful if you don’t understand how sentience or consciousness works… It’s funny how quickly we’ve discarded the lessons of Alan Turing, the point behind his test was that we can’t prove that another human is conscious. Therefore any entity that can pass for a human must be assumed to be conscious.
youtube AI Moral Status 2025-07-12T19:4…
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | deontological
Policy         | none
Emotion        | approval
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxOudtQbBHla8RVwEZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwrwW4O9OhZldX7Hwp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyYamWfkkhqV1no5q14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugya1O5DYn5fVO-qqA14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzmuU-hlbAlDnOx8Yx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxU--PQq38Q5JPhFBd4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxLtxSV8Y2RRFvjvPl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzPjH0U2xQzz2wpEqZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzSeJ3VQ95WPYp1GHZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzLsKa_yBOQWvMwGjx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
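The raw response above is a JSON array in which each object carries a comment `id` plus one value per coding dimension. A minimal sketch of how such a batch response could be parsed and validated, assuming the category sets are exactly the values seen in this response (the full codebook may define more):

```python
import json

# Allowed values per dimension — ASSUMED from the example response above,
# not an authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"none"},
    "emotion": {"approval", "fear", "indifference", "outrage", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding_dict},
    rejecting any value outside the known categories."""
    codings = {}
    for item in json.loads(raw):
        cid = item.pop("id")  # remaining keys are the coding dimensions
        for dim, value in item.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"unexpected {dim}={value!r} for {cid}")
        codings[cid] = item
    return codings

# One entry from the response above, used to look up a single coding:
raw = ('[{"id":"ytc_UgzSeJ3VQ95WPYp1GHZ4AaABAg","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
coding = parse_codings(raw)["ytc_UgzSeJ3VQ95WPYp1GHZ4AaABAg"]
print(coding["emotion"])  # approval
```

Matching each object back to its comment by `id`, rather than by position, keeps the coding correct even if the model reorders or drops entries from the batch.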