Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
An AI detecting hallucinations should not be that difficult. Probably a non-AI would do that best. On the other hand: imagine a human would have digested 90% of the world knowledge. Would this human also start to hallucinate?
youtube AI Responsibility 2025-11-23T21:5… ♥ 1
Coding Result
Dimension      | Value
Responsibility | unclear
Reasoning      | mixed
Policy         | unclear
Emotion        | mixed
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxBaFgxNGd9xVDx6jl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwjYiNcKwF3YL_npIV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzz45IVFqYabVsOfet4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzIyDnOmzgPCiKgqod4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyrHe5s1BS12R97yqZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwTm35G54OzZeyua3Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyBIeo4W6Nc20GZdhd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyPhfLmHSJhgsl2mwp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwJGtBQ5VpgiRKhJSB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyEBi2Bs0YivanUP-h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
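A minimal sketch of how the raw response above could be mapped back to the per-comment coding result: the model returns a JSON array of records keyed by comment id, so recovering the dimension values for one comment is a lookup over that array. The helper name `codes_for` is hypothetical, and the excerpt below embeds only two records from the response for brevity.

```python
import json

# Excerpt of the raw LLM response shown above: one record per coded comment.
raw_response = '''[
  {"id":"ytc_UgyrHe5s1BS12R97yqZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwTm35G54OzZeyua3Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"}
]'''

def codes_for(comment_id: str, raw: str) -> dict:
    """Return the dimension/value pairs coded for one comment id (hypothetical helper)."""
    for record in json.loads(raw):
        if record["id"] == comment_id:
            # Drop the id itself; the rest are the coding dimensions.
            return {k: v for k, v in record.items() if k != "id"}
    raise KeyError(comment_id)

print(codes_for("ytc_UgyrHe5s1BS12R97yqZ4AaABAg", raw_response))
# {'responsibility': 'unclear', 'reasoning': 'mixed', 'policy': 'unclear', 'emotion': 'mixed'}
```

The printed dictionary matches the Coding Result table for this comment (all four dimensions unclear/mixed), which is how the tool's table view and the raw model output can be cross-checked.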