Raw LLM Responses

Inspect the exact model output behind any coded comment: the comment itself, its parsed coding result, and the raw batch response the result was extracted from.

Comment
In college, we read an article that explained facial recognition made in the US couldn’t distinguish Asian people, basically thinking they all look the same. A facial reconciliation software in China had the same problem, but for white people. Human bias makes biased AI.
youtube 2023-09-23T18:1… ♥ 12
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           unclear
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
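The coding result is one parsed record with four categorical dimensions plus a timestamp. Below is a minimal sketch of that record shape, assuming the field names used in the raw response further down; the value lists name only the categories observed in this example, so the full codebook may define others.

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment, as parsed from the raw LLM response."""
    id: str              # comment id, e.g. "ytc_Ugwo8uGTyrUDIiTD9yt4AaABAg"
    responsibility: str  # observed: developer, ai_itself, user, unclear
    reasoning: str       # observed: deontological, virtue, unclear
    policy: str          # observed: none, unclear
    emotion: str         # observed: outrage, indifference, approval
```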
Raw LLM Response
[ {"id":"ytc_UgxX-PlruuodVHD2f4F4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugx7nStdvLzs-2dbm-V4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugwo8uGTyrUDIiTD9yt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugz439HDOEytBjs2z3F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}, {"id":"ytc_UgwlgcJpqmPUPXdwdyB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgxT3gX36oDjBXcH55N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugz9aAgwswWc0toL7lt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgyEB6nyUFl5YAO5uMF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwiYQrDpCmgC-qL_x14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugz4vUmkMnna53usMzN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"} ]