Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I’m over an hour in and he’s still not talked about how we go from LLMs to AGI. I can’t see that jump being made using the tools at our disposal. I don’t think we’ve seen anything to suggest that the jump is even possible
youtube 2025-10-10T19:4…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       consequentialist
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxN95eekbGxkyxyvwl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugylgx6hGThwY3mON5l4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxc5OkHOedrp2jYw5Z4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyGeC6X3Zzv4WQ64TZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwzRqvVobpEugZ-LMF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyTpzpLrLqHKLDqxVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzk0Fp1JQFyFF0iK-N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxJ7E3cxDOOK-4IEYp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugys3XToOKLRxOigfYJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw69kuSySurUqDqS794AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
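The raw response is a JSON array of per-comment codings, each keyed by a comment id. A minimal sketch (Python, assuming only the four coding fields visible in the response above; the sample id is hypothetical) of turning such a response into an id-indexed lookup for inspection:

```python
import json

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and map comment id -> coding dict."""
    return {item["id"]: item for item in json.loads(raw)}

# Hypothetical one-item response in the same shape as the batch above.
raw = (
    '[{"id":"ytc_EXAMPLE","responsibility":"unclear",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]'
)
codings = index_codings(raw)
print(codings["ytc_EXAMPLE"]["emotion"])  # mixed
```

Indexing by id makes it straightforward to pull the coding for the single comment shown on this page out of the batch response.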