Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I wish there was a deeper discussion of concept on "agency" rather than the more general idea of self awareness or conciseness. It seems to me that the problem with AI as described is the evolution of non-intended patterns / behaviors in response to a query by an agent. That is rater different than AI setting a goal by itself for itself.
Source: YouTube · AI Moral Status · 2026-03-01T05:0… · ♥ 32
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugz94zDfwK8ufCkhopd4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy9G7qsLamnJB4_6U94AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzOgLAKvzzCby92o554AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzSY_-MwltajS1IqJl4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzDTpLKIG5w7dIIh3V4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyW7rWzFsYsHYz4hiZ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy_alcdqf4GF6eF-kB4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxgJYnLfCWWI0r243B4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyYmmMLneGWU__NB3N4AaABAg", "responsibility": "none",      "reasoning": "deontological",    "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy349RHtHaYdrgXjm54AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
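A raw response like the one above can be checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the sample output shown here (the actual codebook may define more categories), and the function name `validate_codings` is an assumption, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above. ASSUMPTION: the real codebook may allow more values.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "mixed", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must carry a YouTube-comment id of the form "ytc_…".
        if not str(row.get("id", "")).startswith("ytc_"):
            continue
        # Every dimension must be present with an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

sample = ('[{"id":"ytc_Ugz94zDfwK8ufCkhopd4AaABAg",'
          '"responsibility":"none","reasoning":"unclear",'
          '"policy":"none","emotion":"approval"}]')
print(len(validate_codings(sample)))  # 1
```

Dropping invalid rows rather than raising keeps a single malformed record from discarding the whole batch; whether to re-query the model for rejected rows is a separate design choice.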