Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I read a 1930s sci-fi where people lived in perpetual idleness because they would buy a robot and it would work for them. I could never get past that first part. Bringing it to the modern day where Musk is convincing gullible fools that they should buy one of his cars and it will just Uber around all day every day earning them money. I have yet to see an interviewer asking this blatantly obvious question.
Source: YouTube · AI Jobs · 2025-08-29T00:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxfUCzQtCQRNuHMqqp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzf92tMrmoevxlmutd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwqicrby3Jv1xCKV6J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx240H4eZHVNMl9M0Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzxpL-nyL-wFHP3eGh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyM1OyxWAGYLCh7Lu54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy99wLr6kEsDOuzdW94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxDL2pwXKi0hd-ip4F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxh6eOaMc7b2oXQIqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwDkrv32lY_3ftD8kB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]