Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
A.I. CSAM? That's such a rediculous idea! These are intelligent systems designed by the smarte- OH.... Grok is actively producing CSAM even after it claimed to have "fixed" the issue that was causing it to produce CSAM? Cool yeah... no regulation needed! Very cool, very smart.
youtube 2026-01-08T12:4… ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_Ugzq9h56PjCmj-qWi4N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"approval"},
 {"id":"ytc_Ugzv4sbPmeuvviYxCNh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
 {"id":"ytc_Ugx73qz8exMQH0Ca0RZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzhElJfetYfzMivpa14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugyg6D7TChaZx1joAbN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugz8pR_V5wxnyKSId_54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgywLaZZxbwDhz5rUl94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgzQl_6JVAjd9WmxWO94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugwk-bkWFX7b43pdm3d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwTDOdcYYeT0jaamWR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}]
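The coded values in the table above are extracted from this raw batch response by matching on the comment `id`. A minimal sketch of that parsing step is below; the allowed label sets are inferred from the values visible on this page (the actual codebook may define more categories), and the `raw` string is truncated to two records for brevity.

```python
import json

# Two records excerpted from the raw LLM response shown above.
raw = (
    '[{"id":"ytc_Ugzq9h56PjCmj-qWi4N4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"deontological","policy":"ban","emotion":"approval"},'
    '{"id":"ytc_Ugx73qz8exMQH0Ca0RZ4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]'
)

# Label sets inferred from this page; assumption, not the official codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed", "unclear"},
}

def parse_codings(raw_json):
    """Parse the batch response, validate each label, and index by comment id."""
    by_id = {}
    for rec in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw)
print(codings["ytc_Ugx73qz8exMQH0Ca0RZ4AaABAg"]["policy"])  # regulate
```

Validating against a fixed label set catches the common failure mode where the model invents an off-schema category, so bad records fail loudly instead of silently entering the coded table.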