Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You can tell most of the people who disliked the video don't have a science background. It IS a bias if the algorithm only recognizes a certain type of face. The word bias has no negative connotation by itself, it simply means preference or "works better with". She isn't saying the algorithms are "racist".
youtube 2017-04-16T00:4… ♥ 10
Coding Result
| Dimension      | Value                      |
| -------------- | -------------------------- |
| Responsibility | developer                  |
| Reasoning      | consequentialist           |
| Policy         | none                       |
| Emotion        | approval                   |
| Coded at       | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
  {"id": "ytc_UgzFZ0_gFoBtkA8A8dl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwHeHZdwARg_gZKfkp4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UghAtGyqfRzI23gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghX-XK9nQJeongCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgjF2qj63lIzHHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggDn_u7gXt523gCoAEC", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UggQG6eUAXHh13gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Uggt_f6d7mj4O3gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugh6MY8m_2nnh3gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugi88RwR_1WkJXgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
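Because the model codes comments in batches, a single comment's codes have to be traced back through the array by its `id`. A minimal Python sketch of that lookup, using three records copied from the raw response above (the variable names `raw` and `by_id` are illustrative, not part of the pipeline):

```python
import json

# Three records reproduced from the raw LLM response above; each record
# carries the four coding dimensions: responsibility, reasoning, policy, emotion.
raw = """[
  {"id": "ytc_UgjF2qj63lIzHHgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzFZ0_gFoBtkA8A8dl4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghAtGyqfRzI23gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# Index the batch by comment id so one comment's codes can be retrieved directly.
by_id = {record["id"]: record for record in json.loads(raw)}

codes = by_id["ytc_UgjF2qj63lIzHHgCoAEC"]
print(codes["responsibility"], codes["emotion"])  # -> developer approval
```

The same indexing works for a full batch: parse once, then resolve each comment id against the dictionary rather than rescanning the array.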