Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@navis462 Even if it was awesome it wouldn't. It's not that it can't do something well. It's that it does the wrong things. I frequently have talks with non-techs about things they'd like. They have no idea of the costs of saying yes to their request. (nor should they, that's my job) If a non-tech asked an AI to build something, it might actually build what they asked for, and then we devs could sit back and watch the DB melt, when a query takes 1 hour.
YouTube · AI Jobs · 2025-12-31T07:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgzQO3hIAzw5NgfiA1x4AaABAg.AQrXkfT9sgKAS59bJaQV60", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyjHpw_SEXVR5Tfl494AaABAg.AQqjzkeexxcAQtfwdkRoGM", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyjHpw_SEXVR5Tfl494AaABAg.AQqjzkeexxcAQuwiz-rpeK", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyjHpw_SEXVR5Tfl494AaABAg.AQqjzkeexxcAR0gJ57Fh_a", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugx7Rxbb8w8jBDmshqZ4AaABAg.AQqN8waJeTjARCU5wGmdwf", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugxf80Y69Vj3qCjbP3p4AaABAg.AQnxUAhXTFWATX8HFfaTRu", "responsibility": "none", "reasoning": "none", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugz2Rzcw3jCN-maVGHV4AaABAg.AQn9qTGBpfBARFTh2k_W9R", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugz2Rzcw3jCN-maVGHV4AaABAg.AQn9qTGBpfBARH_aMqlPSd", "responsibility": "industry_self", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugz2Rzcw3jCN-maVGHV4AaABAg.AQn9qTGBpfBARON03qiapF", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugx5aj_2OVBzxGvbEXp4AaABAg.AQn4n0tzWZfAQq_hWCeChW", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
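The raw response is a JSON array of per-comment coding records, each carrying the four dimensions shown in the result table. A minimal sketch of turning such a response into an id-indexed lookup (the variable names `raw`, `codes_by_id`, and `DIMENSIONS` are illustrative, not part of the tool; the single record here is copied from the response above):

```python
import json

# One record from the raw LLM response above (abbreviated to a single entry).
raw = '''[
  {"id": "ytr_Ugz2Rzcw3jCN-maVGHV4AaABAg.AQn9qTGBpfBARON03qiapF",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "none", "emotion": "fear"}
]'''

# The four coding dimensions reported per comment.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_by_id(raw_json: str) -> dict:
    """Index coding records by comment id, keeping only the four dimensions.

    Missing dimensions fall back to "none", mirroring the coding scheme's
    apparent default value.
    """
    records = json.loads(raw_json)
    return {r["id"]: {d: r.get(d, "none") for d in DIMENSIONS} for r in records}

index = codes_by_id(raw)
print(index["ytr_Ugz2Rzcw3jCN-maVGHV4AaABAg.AQn9qTGBpfBARON03qiapF"])
# → {'responsibility': 'ai_itself', 'reasoning': 'deontological', 'policy': 'none', 'emotion': 'fear'}
```

The id is what ties each record back to its comment, so indexing by id is the natural way to join a raw response against the displayed coding result.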