Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@armanke13 The most likely case of bias being presented is the case he brought up. This is just the bias of trying to make a functional network. Learning algorithms themselves are based on relatively simple calculus and nonlinear responses, backpropogation and perhaps generative algorithms. The professed bias is almost always with regards to the training data, which might just reflect the larger trends in daily life. If 70% of shoes on the market (and thus 70% pictured) were jackboots then the network would not be as good at identifying birkenstocks (which lets say are 5% of the market) due to less exposure. However it is better with 70% of the data then 5%, and if you continue to feed data it will get as good with that 5% over time as it continues to improve the 70% too. This is the best case scenario since that means the network improves overall the most over time. Instead people are acting regressively and saying that you should focus on making that 5% as good as the 70% at the same point in time, which means you have to vastly oversample the 5% (on par of 50%) to make it as good at all points in time as the 70%. This design philosophy means you have to spend over 10 times the computations to reach comparable performance since you are focusing on edge cases over the most common use cases. This also becomes worse for each new case of "bias" you consider, quickly getting rid of the main benefit of these systems, i.e.: that they learn quickly to do solve a wide range of issues.
youtube AI Bias 2019-03-07T15:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgxpHqg1ZyU9X40dZUV4AaABAg.8pTUZa6NaYg8pTcHre3gtU", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxNi0I8_2uoEd8GT594AaABAg.8pPJA2TT3c68pTaaOcfNb5", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytr_UgxNi0I8_2uoEd8GT594AaABAg.8pPJA2TT3c68pTbgIoqhuk", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzJszDlZ8qyAtlO4Lp4AaABAg.8pJgwDGPKfT8pTYbz1ankP", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwVhs0ZtK8GOuOcRDl4AaABAg.8pCekZGUP8F8ptmaRWZthv", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgwVhs0ZtK8GOuOcRDl4AaABAg.8pCekZGUP8F8sB4AxX6-AR", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugxf_-Qm3m9mbI8mq3F4AaABAg.8p8EUF6aZyY8pTYxMqP4YR", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytr_Ugz2M2k5DoD5uK6VYyh4AaABAg.8p4aTXVwFuE8p4yA8IyGhH", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugx6znuogH2jYuRPBWV4AaABAg.8nHHmiHsj8H8p4WWU-VziW", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyvoP_seikcvfjwsox4AaABAg.8n6Cjgz1NSs8n7IgoEiqVs", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
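When inspecting raw responses like the one above, it can help to check mechanically that every coding falls inside the scheme. Below is a minimal sketch in Python; the allowed values per dimension are inferred only from the codes visible on this page (the actual codebook may define more categories), and `validate_codings` is a hypothetical helper, not part of any existing pipeline.

```python
import json

# Allowed values per dimension, inferred from the codes shown on this page.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "industry_self"},
    "emotion": {"indifference", "mixed", "resignation", "approval"},
}

def validate_codings(raw):
    """Parse a raw JSON response and return ids with out-of-scheme values."""
    bad = []
    for row in json.loads(raw):
        # Flag the row if any dimension is missing or outside ALLOWED.
        if any(row.get(dim) not in ok for dim, ok in ALLOWED.items()):
            bad.append(row["id"])
    return bad
```

Running this over the raw string above should return an empty list; a non-empty result flags comment ids whose coding the model produced outside the expected scheme.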