Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I studied communications, and I always imagined myself as a journalist or maybe a respected author. I do write — fantasy novels in Spanish that no one really knows about. But life took me somewhere unexpected: I ended up working for one of the biggest social media companies in the world. You almost certainly have their app on your phone. Because of an NDA I can’t name it, but I was hired to write stories, essays, letters, articles… anything that could teach their AI how to mimic human writing. At first, it was exciting. I was watching this huge project grow, line by line, and I was part of it. But slowly I realized what I was really doing: training a machine to replace me. And not just me — anyone whose work depends on words. Later, my role shifted to teaching AI how to understand people’s intentions through their chats, likes, and behavior. That’s when I understood how much privacy was being traded away, and how far this could really go. Now I work for another company that sells AI solutions to corporations: banks, restaurant chains, even PepsiCo in Latin America. And the truth is, the money is good. I live comfortably. But every day I help train models designed to shrink entire departments. In customer service or collections, for example, maybe 20% of workers will stay, while the other 80% lose their jobs to what I helped build. Sometimes I wonder if I’ll be one of the “lucky ones” who keeps working for decades, training and maintaining these systems — or worse, if I’ll become the one deciding who gets to stay and who doesn’t. It’s a strange kind of doom: I studied communications to tell stories, but instead I’m part of the story that ends with entire careers disappearing. Maybe even mine.
youtube · AI Jobs · 2025-09-01T01:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwF8M4lfMrDCf1E8PV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugym0dhr9iz63M-645Z4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwU-AwA-B9mCCbS0q14AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgzkBopRAzSa-d6TORJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy1dAhGdLzYzZyoORF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyRLJpmtM6QDKBI9pl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyHuBDKgdQ6ickTrVF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxUxWU7yFeMhn4Fmsh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzRWVD47w_K6TJGXMJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyY_2VbO58AZClBsoZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
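Since the raw response is a JSON array keyed by comment id, the coded dimensions for any one comment can be recovered by parsing it and indexing by id. A minimal sketch, assuming the response parses as valid JSON and that every record carries the five fields shown above (the two records below are copied from the raw response; any downstream validation or storage step is not part of the source):

```python
import json

# Two records copied verbatim from the raw LLM response above;
# the full array is parsed the same way.
raw = """
[
  {"id": "ytc_UgzkBopRAzSa-d6TORJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwF8M4lfMrDCf1E8PV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"}
]
"""

# The five fields present in every record of the raw response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)

# Index the coded dimensions by comment id so a single comment's
# coding result (as shown in the table above) can be looked up.
by_id = {}
for rec in records:
    missing = EXPECTED_KEYS - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id')!r} is missing keys: {missing}")
    by_id[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS - {"id"}}

print(by_id["ytc_UgzkBopRAzSa-d6TORJ4AaABAg"]["emotion"])  # indifference
```

The key-set check guards against a model response that drops or renames a dimension, which would otherwise surface later as a silent gap in the coding table.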