Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think where it gets really tricky is when you have AI models not necessarily cloned from a VA's voice, but where the VA could have been the alternative. Like Troy Baker doesn't have _an_ iconic voice. His value is that he can do _tons_ of iconic voices tailored to what the licensors want. AI is more of a threat in that context. Not the, "I want it to sound like Troy Baker," context but in the, "I want it to sound like a middle aged guy who's seen some shit but has settled into a monotonous life," context. Like what if you use an AI that was never trained on Troy Baker's voices, not based off any of his existing voices, but is still a voice that you might hire Troy Baker to do? Troy Baker still loses work, but you didn't really take anything that belonged to him to make the voice yet.
reddit AI Jobs 1722951543.0 ♥ 79
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_lgsb6ke", "responsibility": "company",   "reasoning": "deontological",    "policy": "regulate",      "emotion": "outrage"},
  {"id": "rdc_lgs1aq7", "responsibility": "company",   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "rdc_lgrq5h1", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "rdc_lgs50ll", "responsibility": "user",      "reasoning": "virtue",           "policy": "none",          "emotion": "mixed"},
  {"id": "rdc_lgtcdr3", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability",     "emotion": "approval"}
]
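The raw response above is a JSON batch: each record codes one comment on four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch can be parsed and a coded comment looked up by id, assuming the response is valid JSON as shown; the record `rdc_lgrq5h1` is the one whose values match the coding result displayed above:

```python
import json

# Raw LLM response, copied verbatim from the batch above.
raw = """[
 {"id":"rdc_lgsb6ke","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"rdc_lgs1aq7","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"rdc_lgrq5h1","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"rdc_lgs50ll","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
 {"id":"rdc_lgtcdr3","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]"""

records = json.loads(raw)

# Index records by id so the coding for any one comment can be looked up directly.
by_id = {r["id"]: r for r in records}

coding = by_id["rdc_lgrq5h1"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

In practice a validation pass would also check that every value falls within the coding scheme's allowed labels, but that schema is not shown here.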