Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's a tragic mistake to use an LLM AI system that (1) you do not own, (2) you have no control over, and (3) gives you no way to keep your training data/process within your company.

There are 2 tragedies at work:
A) Disintermediation: the owner of the main system (the LLM, whether that's OpenAI or Google or whoever) has all the data you used for training.
B) A brief lifespan ahead for any AI "app" or agent etc. that is built on top of a not-owned system.

OpenAI, Google's Gemini, and all the other LLMs are *_NOT_* owned by any app/agent/etc. developer building a product on top of the not-owned LLM. For the 99.99% of companies creating ANYTHING built on top of one of the not-owned LLMs, it's like walking blindfolded at the Grand Canyon: it won't last long, and there is a very hard fall coming very quickly.

I've been in tech a long time, and there have been SO MANY cases of "disintermediation" that destroyed many companies simultaneously; it's shocking that the industry HAS NOT LEARNED THE LESSON. "If we don't build the foundation, we can't control it, and if we can't control it, we don't really own it."

*DISINTERMEDIATION EXAMPLES*
"Developers of third-party Reddit apps fear shutdown because of API pricing changes"
"Zuckerberg Killed Zynga -- and No Jury Will Convict Him"
"The third-party apps Twitter just killed made the site what it is today"
Source: youtube · 2024-05-02T04:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
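Each coding result assigns exactly one label per dimension. As a minimal sketch of that record shape (assuming Python; the field names match the keys in the raw response below, while the listed values are only those observed in this response, not necessarily the full codebook):

from dataclasses import dataclass

# Minimal model of one per-comment coding result.
@dataclass
class CodingResult:
    id: str              # comment identifier, e.g. "ytr_..."
    responsibility: str  # observed: company, user, distributed, ai_itself, none, unclear
    reasoning: str       # observed: consequentialist, deontological, virtue, mixed, unclear
    policy: str          # observed: regulate, liability, industry_self, none, unclear
    emotion: str         # observed: fear, outrage, approval, indifference, mixed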
Raw LLM Response
[ {"id":"ytr_Ugw26MzSS1EzH7z9cuB4AaABAg.A2VEOGjxZMGA2vUThAH-Jr","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytr_Ugw26MzSS1EzH7z9cuB4AaABAg.A2VEOGjxZMGA3-I-DkAgSP","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytr_Ugx3SBvBK7VcA2N6xHB4AaABAg.AVXRIIstceDAW-BWGew7FI","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}, {"id":"ytr_UgzPDb4aFwI6Vgl2xMV4AaABAg.AVHbz68FpUhAVzq_wSx2Oy","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytr_UgzofA_iVC6H8kVNgpV4AaABAg.AV0pH1e5GgxAVzqMDQ05CB","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"}, {"id":"ytr_Ugz3cRoOOT5XdMM0PbJ4AaABAg.AUwewkJXpTaAVzrA_z-JbA","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytr_Ugz3cRoOOT5XdMM0PbJ4AaABAg.AUwewkJXpTaAVzu3y5FhMN","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytr_UgxZKO3JENHImtcJr5B4AaABAg.ATwFfiZvJGMAW02iaNHAic","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytr_UgwsngAsxrMkg3lzhcV4AaABAg.ARaibIjx209AVzqnBK4uoR","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytr_UgwsngAsxrMkg3lzhcV4AaABAg.ARaibIjx209AW-AusMEbu9","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"} ]