Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well this has aged like milk. The real way it goes down. Your CEO is highly influenced by VCs and AI hype bros on X formerly twitter. He grossly overestimates the abilities of AI. He raises a huge round to get on the hype train. This would seem like a good thing for you but he doesn’t realize that increased capex really means paying for stupid expensive apis and/or h100s, your salary remains mostly the same but you get a nice little boost to your stock compensation from the appreciation. Your company’s customers hate the AI product your CEO made you shit out and ACV stays flat. OpenAI releases chat-gpt 5 but calls it 4.5 to further delay facing the music that Sam Altman doesn’t really have any grasp of sigmoidal curves or scaling laws. The market tanks as the MAG 7 realize that they are losing investor interest and need to cut capex. NVIDIA nose dives. If you’re lucky you guys went the api route and can cut your sunk costs, sadly some of the team has to go as you are actively churning customers by promoting the ai tool nobody asked for. If your CEO bought h100s you have some rapidly depreciating inventory. He knows he has a major L coming as the h100s are selling for 0.05 on the dollar. To offset the book loss he takes an axe to 3 quarters of the team. Your former employer changes their domain back from .ai to .com. Your stock is worth nothing now as your CEO raised money with an insane liquidation preference that he’ll never raise at again. Good thing Google just made a major breakthrough in quantum computing. The end!
youtube 2025-03-07T04:2… ♥ 2
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         indifference

Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzZsAZQz7bezKdLkKx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxF4Y9eC3RFmtYr7wl4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxOSg69f0rlCOKGWEd4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx9vNICtEuz-uMIFwZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyYXcXeSvoLC2WQGhh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxTqTcUVZV7nJb81Zd4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwV3weC3AHbbAcWJtV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgycKCKjPTDJ-E5PKLl4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugynitq9GOus_AZRmcJ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw_IEZN52_dXmhxbIx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
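The raw response is a JSON array with one coding record per comment id. A minimal sketch of looking up the record for a given comment, assuming only the array structure shown above (the `coding_for` helper name is illustrative, not part of any tool shown here):

```python
import json

# A trimmed raw LLM response with the same record shape as above.
raw = '''
[
  {"id": "ytc_Ugx9vNICtEuz-uMIFwZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzZsAZQz7bezKdLkKx4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
'''

def coding_for(response_text, comment_id):
    """Return the coding record matching comment_id, or None if absent."""
    records = json.loads(response_text)
    return next((rec for rec in records if rec["id"] == comment_id), None)

record = coding_for(raw, "ytc_Ugx9vNICtEuz-uMIFwZ4AaABAg")
print(record["emotion"])  # indifference
```

This matches the coded values shown for the comment above (responsibility: company, emotion: indifference); a missing id simply returns `None` instead of raising.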