Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Mixture of Experts is also a concept that predates LLMs as we know them today. That is transformer models. And the first transformer models that introduced MoE architecture were Western ones, not Chinese.
youtube · AI Governance · 2026-04-21T09:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_Ugyho4YAo19NPbyQ-wt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgymriO1cDl0b7BmyVZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyaoldSKD4I6ePNLqR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyfPdsytKUVZ6h6EXx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwdNrFrGyh8qNxC9s54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxpgY1u6kfTouEZPk94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytc_UgwTDHS0hfCuLDvaiSV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgxRqHQuS1wWd95klZd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgwdE30KQoVqNc3on-B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyrsX4907aE3dortX94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]