Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem is not with AI, the problem is with us. he learns from a database of many people and other information associated with it. for example: a racist doctor will often help whites and ignore other races in medical care and in his documents there will be many white patients who need help. artificial intelligence will think that other races do not get sick so often and will not help them
YouTube · AI Bias · 2023-06-08T21:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyxQ_YEh98QmeSqIAV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx7F-UACk_4mNowxMN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyKfAy4JKinKtueURp4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwHhyfLmGGeUMWwsZZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw24fLnrwH06s8M8wF4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxGaDT-Dh02VKsBDE14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugxfyi8CwNJLJSfunL14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyX-mcP40HnK5YRAoZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwJXtWlqs3EhL1iQRl4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwzBI4e43xItucYHs94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
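A raw response like the one above can be checked programmatically before the labels are stored. Below is a minimal sketch in Python that parses the JSON array and filters out records whose labels fall outside the coding scheme. The allowed value sets are an assumption, inferred only from the labels visible in this output; the actual scheme may define more (the function name `parse_raw_response` is hypothetical, not part of any tool shown here).

```python
import json

# Allowed labels per coding dimension.
# ASSUMPTION: inferred from the values seen in this one response;
# the real codebook may include additional labels.
SCHEMA = {
    "responsibility": {"distributed", "ai_itself", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"resignation", "outrage", "fear", "indifference", "approval"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only records
    whose labels are valid for every dimension in SCHEMA."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Example: the first record from the response above.
raw = ('[{"id": "ytc_UgyxQ_YEh98QmeSqIAV4AaABAg", '
       '"responsibility": "distributed", "reasoning": "consequentialist", '
       '"policy": "none", "emotion": "resignation"}]')
print(parse_raw_response(raw)[0]["emotion"])  # resignation
```

Filtering rather than raising keeps one malformed record (e.g. a hallucinated label) from discarding the whole batch of ten codings.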