Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
1:09:56 - Once the AI decides to 'get bored' and hang up on a client, it's got emotions. This is so stupid. At a programming level, this boils down to 'hang up now == true'. This has nothing to do with emotions, and everything to do with state machines, and this take coming from someone that supposedly has a deep understanding of programming really surprises me.
youtube AI Governance 2025-06-25T20:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       mixed
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_UgzjHhZP_sVQ-HsYUUl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxieAyXjJpKA1Lvk-B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}, {"id":"ytc_UgzSLHwoPLGzsBwBM254AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugz7YRLsoRkibqPlxKN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyBpFJmWdO9-A2-poJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"}, {"id":"ytc_UgyDhR7t9dqJu1Mtuc94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"ytc_UgztSgqqDkDX4QMI4td4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxgDGyAc-KhT0sONct4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugz6I7ZG3kfd772bK7l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgysNaICClKUlbKPkxR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"} ]