Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"Fix the hallucinations" is a much more complicated problem than it sounds. LLMs cannot fix the hallucinations; they're a core consequence of how they work. To do so requires not a new model of LLM, but a fundamentally different sort of AI. Which means going back to research, and likely taking several years - possibly a decade or two - to build a novel technology and bring it up to something as usable as LLMs are now. During that time, they continue to make no money, so the bubble still pops even if the hallucination problem can be resolved. They (as an industry) need to generate trillions of dollars of new revenue to cover how much money they've already lit on fire with this, and to cover the operating costs (GPU, power, etc.), to be able to make a profit, and we're already seeing signs that investment money is straining. I do not think there is enough venture capital money to keep the bubble inflated for the time it would take to make an actually reliable (or profitable) AI, from where we are currently.
youtube AI Responsibility 2025-12-27T17:4…
Coding Result
Responsibility: none
Reasoning: consequentialist
Policy: none
Emotion: indifference
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzrzhONLX7bdSVozrp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwNftwT3j4ZloL1kfV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzWAQlrfZTtwfbMiNh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwqMdu43iGd7bTnHz14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx5RDZP3F2dN5T6Kep4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgytPavtmpqgKCLQFxZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxEtcylN1hWxdsbo6F4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy7hLzfTOMVipvHQRl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy03q5kfV8LIOdJ4854AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx-a0LkDttDfZ_ZIQN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
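When inspecting raw model output like the batch above, it helps to check each entry against the codebook before accepting it. The sketch below parses the JSON array and flags any value outside the allowed categories. The allowed-value sets are assumptions inferred only from the labels visible in this batch; the real codebook may define more categories.

```python
import json

# A small excerpt of the raw LLM response shown above (two entries).
raw = '''[
  {"id": "ytc_UgzrzhONLX7bdSVozrp4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwNftwT3j4ZloL1kfV4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

# Assumed codebook: inferred from labels seen in this batch, not authoritative.
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "resignation", "outrage", "approval", "mixed"},
}

def validate(entries):
    """Return (id, dimension, value) tuples for any value outside ALLOWED."""
    errors = []
    for entry in entries:
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                errors.append((entry.get("id"), dim, entry.get(dim)))
    return errors

entries = json.loads(raw)
print(validate(entries))  # an empty list means the batch is well-formed
```

A check like this catches the common failure modes of LLM coders: misspelled labels, invented categories, or missing dimensions, before the codes enter analysis.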