Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I asked Copilot two C# related questions this week. It got both wrong. In both cases I told it that it was wrong and after “thinking” for a considerable amount of time it confirmed its error. I see no point in asking it any more questions if I have to constantly correct it.
youtube AI Jobs 2026-02-07T21:2…
Coding Result
Dimension       Value
---------       -----
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzMYfkpeeS55gfpahJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzaExbrcIT889DYwSJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwTRyy6W1ZBfcRy_Fh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyG-MYm6Xr-vbsbkFl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxa9YJPPZwccoYa79R4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzikaaVWgl2hLB68v94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzveMPuS66wXye9wfF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx6yHwU6KJxk-PzHPl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx4L_KGpiBUPpCtw7h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugwsp_i36AoyZySnaDN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"}
]
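To verify that the coded dimensions shown above really came from the raw model output, the response can be parsed as JSON and the record looked up by comment id. A minimal sketch in Python; the `raw_response` string is abbreviated to the one record matching the table above (id `ytc_UgyG-MYm6Xr-vbsbkFl4AaABAg`), and the field names are taken directly from the output:

```python
import json

# Abbreviated raw LLM response: a single record copied from the array above.
raw_response = '''[
  {"id": "ytc_UgyG-MYm6Xr-vbsbkFl4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "resignation"}
]'''

records = json.loads(raw_response)

# Index the records by comment id so any comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

coded = by_id["ytc_UgyG-MYm6Xr-vbsbkFl4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # ai_itself resignation
```

The same lookup works against the full ten-record array; a real pipeline would also want to catch `json.JSONDecodeError` for responses that are not valid JSON.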