Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@YashaGuptaMD Ray Kurzweil predicted we will have AGI by 2030 (Artificial General Intelligence) which is just AI that is the same intelligence as a human. Then it gets more intelligent exponentially after that. By the looks of how AI is progressing, I think we’ll have AGI a few years sooner. Once it hits that point, it won’t be long until the AI will invent the schematics of how to build a humanoid robot that can do anything physically a human can do. With the intelligence of an AGI combined with this new invented humanoid robot, yes it won’t be long until radiologists also get replaced. These people are right, you have about 8 - 10 years left just like the rest of us.
youtube AI Jobs 2023-03-14T21:2… ♥ 3
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgyfsXvJBd3RVfvkFdR4AaABAg.9tFd9m_vxsn9tGGBS6Ie1g","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugyh-Se3ZADVoe4YS8p4AaABAg.A0ACSZZhJXaA16MOGQu-JU","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyzbPxNHGe5NWelfpp4AaABAg.9zVDVZNy3jr9zVDg03ldz8","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyBCtuqTWK-3qfbiBl4AaABAg.9rgV0M-lJP49zVCtl8bg3q","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzQ9ypdolANlTAG0HV4AaABAg.9m5mykKlEQ99o5JjY1E1Yz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxNEiYXnN2NDdulB614AaABAg.9kkVj66owsO9nQ4ikfyGDv","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxNEiYXnN2NDdulB614AaABAg.9kkVj66owsO9pWU7Edp8Df","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxNEiYXnN2NDdulB614AaABAg.9kkVj66owsO9sTfE_b08aw","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwGUixNU06S24hCtYl4AaABAg.9fMahh7qAqq9nFgzjkxERy","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwFxGOD0VzdsjR4r0R4AaABAg.9YzujpEfIPS9YzxnsrZlba","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
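A minimal sketch of how a raw response in this format might be parsed back into per-comment codes. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above; the allowed-value sets are assumptions inferred only from the values observed in this dump, not the full codebook, and the sample `id` is hypothetical:

```python
import json

# Abbreviated sample in the raw-response format shown above.
# The id here is hypothetical, not a real record id.
raw = '''[
  {"id": "ytr_example", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# Value sets inferred from this dump; the actual codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "government", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def parse_codes(text):
    """Parse a raw LLM response and keep only records whose codes
    fall inside the assumed value sets for every dimension."""
    records = json.loads(text)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

codes = parse_codes(raw)
print(codes[0]["emotion"])  # fear
```

Filtering rather than raising on out-of-vocabulary codes mirrors how such pipelines often tolerate occasional malformed LLM output; a stricter variant could log or re-prompt instead.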