Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The reason hallucinations have to stay is because all the base models have to stay. Evolution doesn't get rid of parts of your brain, it builds on top of it. And the more times we rewrite the ai, and stack those layers and layers of learning on top of eachother and then one layer can talk to another layer unintentionally it acts like actual synapses. And may be the key to conscious AI.
YouTube · AI Moral Status · 2026-01-24T04:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          industry_self
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw-WvKicIaeOqH3NrR4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_Ugwb67oLlWURSZ5mLLZ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_UgxN6Y34g4qQUWhrgEZ4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgzWlBmesWeTaDRGa-t4AaABAg", "responsibility": "ai_itself",   "reasoning": "virtue",           "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgwTPDRmNdHdb0bwnmB4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_Ugye8OqqTc6UlBWPeip4AaABAg", "responsibility": "distributed", "reasoning": "virtue",           "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_Ugzfe0GExYy_1wD1D1x4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgwY-GXPB0CdL5BftsZ4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "ban",           "emotion": "resignation"},
  {"id": "ytc_Ugwmr4AgKFk-6KmQdi14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgwP95YfqoXwF4qq5Gt4AaABAg", "responsibility": "none",        "reasoning": "virtue",           "policy": "none",          "emotion": "approval"}
]
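A raw response like the one above can be parsed and sanity-checked before the per-comment codes are displayed. The sketch below is a minimal illustration, not the pipeline's actual code: the `ALLOWED` sets are inferred only from the values visible in this response (the real codebook may contain more categories), and `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed codes per dimension, inferred from the values seen in the raw
# response above — an assumption, since the full codebook is not shown here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "user", "distributed"},
    "reasoning":      {"unclear", "consequentialist", "virtue", "deontological"},
    "policy":         {"none", "unclear", "industry_self", "regulate", "ban", "liability"},
    "emotion":        {"indifference", "fear", "mixed", "approval", "outrage", "resignation"},
}

def parse_codings(raw):
    """Parse a raw LLM response (JSON array of coding records) and keep
    only records whose four dimension values are all in the codebook."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items())
    ]

# Example: one valid record and one with an out-of-codebook value.
raw = (
    '[{"id":"ytc_UgxN6Y34g4qQUWhrgEZ4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},'
    '{"id":"ytc_bad","responsibility":"robot","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
codings = parse_codings(raw)
print(len(codings))                    # 1 — the invalid record is dropped
print(codings[0]["responsibility"])    # developer
```

Dropping (rather than repairing) out-of-codebook records keeps the displayed "Coding Result" trustworthy: any hallucinated category in the model output is excluded instead of silently shown.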