Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So they trained it for 10 years on a data source that was already biased...

Literally took ONE graduate class in AI and that's the first thing they mention about training AI's. They tell you the story of trying to detect tanks in trees, where all of the pictures of tanks were taken during the day and all the ones without were taken at night, and they basically ended up making a day/night machine instead of a tank detector.

How the fuck can AMAZON fuck up so badly. Like aren't these people supposed to be the best and brightest and all that shit?
Source: reddit · Cross-Cultural · 1539266406.0 · ♥ 3
Coding Result
Responsibility: developer
Reasoning: consequentialist
Policy: none
Emotion: indifference
Coded at: 2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_e7keifq", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability",     "emotion": "outrage"},
  {"id": "rdc_e7keda1", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability",     "emotion": "outrage"},
  {"id": "rdc_e7koo9j", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "rdc_e7irmvj", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",          "emotion": "outrage"},
  {"id": "rdc_e7jqq89", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"}
]
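The raw response covers a whole batch of comments, so recovering the coding for one comment means parsing the array and selecting by id. A minimal sketch, assuming the raw response is exactly the JSON array shown above and that `rdc_e7koo9j` is the id of the comment in this record (both taken from the response itself, not from any documented API):

```python
import json

# Raw LLM response as shown above: one JSON array coding a batch of comments.
raw_response = """[
  {"id":"rdc_e7keifq","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_e7keda1","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_e7koo9j","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_e7irmvj","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"rdc_e7jqq89","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]"""

records = json.loads(raw_response)

# Index the batch by comment id so one comment's coding can be looked up directly.
by_id = {record["id"]: record for record in records}

coding = by_id["rdc_e7koo9j"]
print(coding["responsibility"], coding["emotion"])  # developer indifference
```

The lookup reproduces the Coding Result block above (responsibility: developer, emotion: indifference), which is one way to spot-check that a stored coding matches the raw model output it was extracted from.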