Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Sabine, you lost me at hallucinations. it isn't a bug, is a feature. Humans create and invent answers all the time. it's part of the process of thinking. LLMs will not achieve AGI for other reasons.
youtube 2026-03-28T14:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgwyIXLsWvWhzS-2YFN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxIRV7jvgGKNfespXx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugybvbfm5hik3wd8rl14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgyX2HK0718tPbbd3914AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugxr7adwK8n_laIwix94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgwYumthFjcbUSaArE54AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UgycjZ4gx7j2AaFWGw94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_Ugw396-oN8fUs6WaL4V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgzdpnfWfbt5O_LT8ZJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgyabO2DqGU34ToWcDF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"} ]