Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The fact that wrong answers are called "hallucinations" says it all. Hallucinations require imagination. AI has no imagination. The entire AI industry is propped up by such smoke ans mirrors. I already know "this isn't going to happen". I just watched to hear your reasoning. Too bad you baked in an ad. I won't see another of your video's now.
youtube 2025-11-08T15:3…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyPENHm8nttBYyog8x4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwVMlPN66H4ujPYBwx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx0bVcznkDOms2rEzF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxofFY44puJdtb-ixl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzanZnHhANC7GhGbRV4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxBId35OZPaLaOAmV14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw7qzFp2MGrAS1AcpV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwNPYZC7pSTwHWMnUB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxa4KLoV8bx0nh-iGx4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxTQZzqUx7F2rXrskd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
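The raw response above is a JSON array with one object per comment, keyed by comment id. A minimal sketch of extracting the coded dimensions for one comment from such a response, assuming the batch format shown (the variable names and the use of Python's standard `json` module are illustrative, not taken from the tool itself):

```python
import json

# Illustrative excerpt of a raw LLM coding response: a JSON array where each
# object carries a comment id plus the four coded dimensions.
raw_response = '''[
  {"id": "ytc_Ugx0bVcznkDOms2rEzF4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxTQZzqUx7F2rXrskd4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Index the batch by comment id so any coded comment can be inspected directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_Ugx0bVcznkDOms2rEzF4AaABAg"]
print(row["responsibility"], row["emotion"])  # company outrage
```

Indexing by id mirrors how the page above pairs a single comment with its row from the batched response.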