Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think the Chinese Room argument that a robot can't have "understanding" is a bit odd. The only definition you could give "understanding" is the subjective human experience of being able to observe your own system and how the knowledge you have fits together. So it's more a question about consciousness, which we truly do not understand.
youtube 2016-12-17T23:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UghzSNvx5uegVXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugh4c8e50QCj8ngCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ughs4I7KaS737ngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghMDBUyZnC2FngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugji64XRryqdW3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjkwrO3Ez8d7HgCoAEC", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugil65uTie1NIHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UginAXPkht8ngHgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UggFbFrSekyybngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjSmAuFWGCX-3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
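The raw response above is a JSON array with one record per comment, keyed by comment id, with one value per coding dimension. A minimal sketch of how such a batch could be parsed and summarized (the two sample records are taken from the response above; the parsing code itself is an illustrative assumption, not the tool's actual implementation):

```python
import json
from collections import Counter

# Raw LLM response: a JSON array of per-comment codes, abbreviated to two
# of the ten records shown above for brevity.
raw = '''[
  {"id": "ytc_UghzSNvx5uegVXgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugh4c8e50QCj8ngCoAEC", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]'''

codes = json.loads(raw)

# Index by comment id to look up the codes for a single comment.
by_id = {c["id"]: c for c in codes}
print(by_id["ytc_Ugh4c8e50QCj8ngCoAEC"]["emotion"])  # mixed

# Tally each dimension across the batch to get per-dimension distributions.
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(dim, dict(Counter(c[dim] for c in codes)))
```

Looking up a comment's id in `by_id` reproduces the per-dimension values shown in the Coding Result table above.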