Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I work in insurance where human input is quite needed when you are dealing with people in their time of need. Our company is pushing to use a lot of templates in our email and conversations that have been developed by AI or polished by them or if we are writing an email, they want us to save time and use Copilot to write the emails. We also have a lot of AI bots that gather information that we may need from different government sites that we have access to and spit them out in our claims. This halves our workload making it quite easy to just sit on the phone and read through all the AI notes and just speak to the customer. But sadly, this has caused so many people, including my partner to be fired by the company because they just don't need that much people anymore as they did before. There is now a big push for us to try and get more claims in place so that "we can keep our job" which is quite scary to think about. I have studies business for quite some time now on the side so I completely understand how the company can benefit from doing all these things but it's quite sad to see that AI replaces some jobs that were there before. We were half joking with our team that right now AI is doing a lot of the admin work in a lot of companies but it will soon develop into this pattern of us doing the small admin work and AI then taking over the big things, "making life easier", cutting costs. Me and my roommate talked about how it may be time to just go into coding because those are going to be the job people will need.
youtube 2025-01-17T21:3…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        unclear
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxgJigojt7Ieu-O2Qh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwAjLX26j_pNc5zkDp4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy4yY5d1lmEzvVEccJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw3WcOKDY3foEs6Ucl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzNcOCnj7LEpO8y0094AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx1LbkH2UZ1tu-0NLt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz7JuxOZduNJOmjyiR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxODkeHU-J8Gdlvp-R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzlikY7SinMwf-sNBR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz52KAUck2qvrwoRHN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
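A minimal sketch of how the per-comment "Coding Result" table can be recovered from a batch raw response like the one above: parse the JSON array, index it by comment id, and look up the id of interest. The mapping of this particular comment to `ytc_UgwAjLX26j_pNc5zkDp4AaABAg` is an assumption, made because that is the only record whose four values match the coded table; the raw string below is a two-record subset of the array for brevity.

```python
import json

# Two-record subset of the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_UgxgJigojt7Ieu-O2Qh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwAjLX26j_pNc5zkDp4AaABAg", "responsibility": "company",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]'''

records = json.loads(raw_response)       # parse the model's JSON array
by_id = {r["id"]: r for r in records}    # index records by comment id

# Assumed id for the insurance comment (its values match the coded table).
coded = by_id["ytc_UgwAjLX26j_pNc5zkDp4AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {coded[dimension]}")
```

Indexing by `id` rather than scanning the list each time keeps lookups constant-time when a batch codes many comments at once.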