Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"Fixing the hallucinations" is not going to happen. The problem is context. Case in point: in a language such as Python, one package's functions all return values on principle, while another package mixes conventions, with some functions returning a value and some not. An LLM has a lot of trouble with this because, contextually, it can go either way: the standards are so open, and human software developers do unpredictable things, so the model fails to account for it. A prompt against one package or the other may return the wrong result, i.e. the model hallucinates a solution that doesn't exist.
YouTube, AI Responsibility, 2025-10-01T03:5…
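The mixed-convention pitfall the commenter describes exists even inside Python's standard library: `sorted()` returns a new list, while `list.sort()` mutates in place and returns `None`. A minimal sketch of how the two conventions diverge:

```python
# Two ways to sort in Python with opposite return conventions.
data = [3, 1, 2]

result = sorted(data)   # returns a NEW sorted list; `data` is unchanged
print(result)           # [1, 2, 3]

in_place = data.sort()  # sorts `data` itself and returns None
print(in_place)         # None
print(data)             # [1, 2, 3]

# A model that has seen both conventions may "hallucinate" a return
# value where none exists, e.g. writing `x = data.sort()` and then
# treating `x` as a list, when it is actually None.
```

This is exactly the contextual ambiguity the comment points at: nothing in the call site signals which convention a given API follows.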
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxabuYtvz5MNnv0PU94AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw6Sb-pOHkdfR564mN4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxfy47jDn32oxNSP814AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwYcasGmq0ClU3f-X94AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzZ2-qplMVQO86bX-54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx4SHM57xiAMGh8tpN4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyXaC9rwHrhMk6L_u54AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw55Y91z7iARWjpbtV4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyhRAtRuqQmphrBokZ4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyIIi7dNM4EdTU197p4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
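A raw response like the one above can be validated before its codes are accepted. A minimal sketch using only the standard library; the field names are taken from the JSON itself, while the specific allowed values and the sample record below are illustrative assumptions:

```python
import json

# Stand-in for one record of the raw LLM response (hypothetical id).
raw = '[{"id": "ytc_example", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "fear"}]'

# Fields every coded record must carry, per the response schema above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)
for rec in records:
    missing = REQUIRED - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id')!r} is missing fields: {missing}")

print(f"validated {len(records)} record(s)")
```

Running a check like this catches the most common failure mode of structured LLM output: a syntactically valid JSON array whose records silently drop or rename a field.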