Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I use Ai professionally and the challenge is managing the context window. The tricks used to extend the memory of Ai works ok for conversations with ChatGPT but aren’t very useful for software development and they cut into the context window anyway. If we can get to a point where Ai has a context window of 200 million tokens which is about 1000 times more than my favorite LLMs, then we will probably have something like AGI
YouTube · AI Governance · 2025-12-04T13:0…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwkSrhvDteXfkKzLcF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx8iyxvz2uYqlSFHY94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgylKRXOeyiNMYDq8Xd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyXuLjqnYFgNjR0qP14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwiwfo864VRJzW8gbJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyyvN5QAC-5d65cIdF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugxq0wUXaIKVSCXlns14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz9y6SSAIEwC1hAbop4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz8BU1zi6vvCR_ilFl4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy7rytCH-AQuMD8lYV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
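The raw response above is a JSON array of per-comment codes, one object per comment with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields. A minimal sketch of loading and schema-checking such a payload (the two records and the `parse_coding_response` helper below are illustrative, not part of the actual pipeline):

```python
import json
from collections import Counter

# Illustrative excerpt of a raw coding response; field names match the
# dump above, but these specific records are made up for the example.
raw = '''[
  {"id": "ytc_example1", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(text):
    """Parse a raw LLM coding response and verify each record's schema."""
    records = json.loads(text)
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing: {missing}")
    return records

records = parse_coding_response(raw)
print(len(records))                              # number of coded comments
print(Counter(r["emotion"] for r in records))    # emotion distribution
```

A malformed response (e.g. a stray closing parenthesis instead of `]`) would raise `json.JSONDecodeError` here, which is one way such dumps can be sanity-checked before coding results are stored.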