Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Actually you'll never have to "dive in and fix" anything, because you'll just ask a future even better model to do it. Software engineering is going to be entirely outcome driven. No more nuts and bolts. Just using the tool and seeing if it meets your users needs and if not prompting a feature modification. Bug observed, noted, automatically fixed (and eventually confirmed by you personally or by automatic testing).
YouTube · Viral AI Reaction · 2026-03-05T17:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw2m5hfiLAzf-MkpfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwq3AmFbJvmKB4MeFN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgypkzhPhGC3xlQ52UV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAspiNyAKZ9_id1194AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyH9enfCTHkOo9Pm6l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzuSWKOR2-0IcmCpxR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyUBo61WycAxacyAsR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxXsvrDaRMDIehoh9N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzgXLYuBQQHuQirxiF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxHZfN5AE64zUTtsvd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
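A response like the one above is a JSON array with one object per comment ID, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be parsed and tallied, assuming only the structure visible in the record (the two sample objects are copied from the array above; the tallying helpers are illustrative, not part of the tool):

```python
import json
from collections import Counter

# Two objects copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgypkzhPhGC3xlQ52UV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzgXLYuBQQHuQirxiF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]'''

codes = json.loads(raw)

# Tally a coding dimension across the batch.
emotions = Counter(c["emotion"] for c in codes)

# Index the codes by comment ID for per-comment lookup.
by_id = {c["id"]: c for c in codes}

print(emotions["approval"])  # 1
print(by_id["ytc_UgzgXLYuBQQHuQirxiF4AaABAg"]["reasoning"])  # virtue
```

Indexing by `id` mirrors how the record view above pairs one coded row (the "Coding Result" table) with its entry in the raw array.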