Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Dude.... A product of computers, an actual fuggin caveat to LITERALLY ANYTHING COMPUTER, is "Garbage in, garbage out." A computer does nothing but what it THINKS you tell it to do. If I want my computer to tell me to love me like my momma never did.... Well, unless someone PROGRAMMED it to do that, it WILL NOT do that. It's like if you wanted a farmer to negotiate with Kim Jong Un to tell him to enroll in an investment account with his bank in order to receive a 2% interest rebate. THE FARMER IS PROGRAMMED TO FARM. A BANKER IS PROGRAMMED TO BANK. THIS DOLL is programmed to simulate distress. It did not malfunction. It did not become sentient. It DID NOT feel pain.... You would need to give it it's own pain receptors and program PAIN.EXE into each individual "nerve ending" and then program a response into it. People thinking we're gonna be lost or genocided to AI? PLEEAASSESESS
Source: youtube · Video: Viral AI Reaction · Posted: 2025-05-28T21:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          unclear
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
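A coded row like the one above can be sanity-checked against the label sets that actually appear in this document. This is a minimal sketch: the allowed values below are inferred only from codes visible on this page, and the real codebook may define more categories (the `ALLOWED` mapping and `validate` helper are illustrative names, not part of the tool).

```python
# Label sets inferred from values appearing in this document;
# the actual codebook may be larger (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear"},
    "emotion": {"outrage", "indifference", "mixed", "approval", "fear"},
}

def validate(coding: dict) -> list:
    """Return (dimension, value) pairs that fall outside the allowed sets."""
    return [(dim, coding.get(dim))
            for dim, ok in ALLOWED.items()
            if coding.get(dim) not in ok]

# The coding-result row shown above.
row = {"responsibility": "developer", "reasoning": "consequentialist",
       "policy": "unclear", "emotion": "outrage"}
print(validate(row))  # an empty list means every dimension is in range
```

An off-codebook value (e.g. a hallucinated emotion label) would show up in the returned list, which makes this a cheap guardrail when coding at scale.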
Raw LLM Response
[
  {"id":"ytc_UgyaA6yVk2aruJUhfed4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwpNB1w-XqlLHbA6qN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz2OyBbi0wVSD7PGNN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzVCg-2OFbnBC-w3wZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzTH1sE33Fxzbq2Ked4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyfzuhFCaAVI1bi1zZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgytnNFRwGdmUYHu_7x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwwcMRzUFvNWqI4muV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzaOIttPuAMnxYtld14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwHyIDILgK1bwQ9WuB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
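The raw response is a JSON array of per-comment codings keyed by comment id, so recovering the row for one comment is a parse-and-index step. A minimal sketch, using only the standard-library `json` module and a two-record subset of the response above (the `index_codings` helper is an illustrative name, not part of the tool):

```python
import json

# Subset of the raw LLM response shown above (two of the ten records),
# kept short here purely for illustration.
raw = ('[{"id":"ytc_UgyaA6yVk2aruJUhfed4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"},'
       '{"id":"ytc_UgwpNB1w-XqlLHbA6qN4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}]')

def index_codings(raw_response: str) -> dict:
    """Parse the batch response and key each coding dict by its comment id."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw)
coding = codings["ytc_UgwpNB1w-XqlLHbA6qN4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → developer outrage
```

The looked-up record matches the Coding Result table above (developer / consequentialist / unclear / outrage), which is exactly the cross-check this inspection view is for.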