Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hallucination is just the technical term for the AI being creative. When it gets facts wrong, we call it a lie; when it writes a poem, we call it art. But mechanically, it's doing the exact same thing. You can't kill one without killing the other.
YouTube · AI Responsibility · 2026-01-09T11:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugz1CRtbPN4wALH_-Mx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzSykzT3_hpnJuy0254AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxufMZ1wUDnFZ39y0p4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgxyKQ7GLg4r1fSETNV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwr60fH9ZSYRcHT6Rd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxlknOBd-P-WU88lnB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyJSl4LQtYqph1uxbh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw8paoUloS59dcR-D54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzriDFVNsGCq0hKbNZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxW3qG6QDSYWl8Vn9h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"disapproval"}
]
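The raw response codes a whole batch of comments at once, so recovering the coding for a single comment means indexing the array by its `id`. A minimal sketch of that lookup, using two entries excerpted from the raw response above (the entry shown in the Coding Result table presumably corresponds to id `ytc_UgxyKQ7GLg4r1fSETNV4AaABAg`, whose values match it):

```python
import json

# Excerpt of the batch response returned by the model (two of the ten entries).
raw = '''[
  {"id":"ytc_UgxyKQ7GLg4r1fSETNV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1CRtbPN4wALH_-Mx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Build an id -> coded-dimensions map so each comment's codes can be retrieved directly.
codes = {row["id"]: row for row in json.loads(raw)}

entry = codes["ytc_UgxyKQ7GLg4r1fSETNV4AaABAg"]
print(entry["responsibility"], entry["emotion"])  # ai_itself indifference
```

In practice the raw string may not parse cleanly (models sometimes wrap JSON in prose or code fences), so a production pipeline would guard the `json.loads` call and log unparseable responses rather than crash.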