Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don’t know why people entertain these things. We’re light years away from having cars like this. Honestly, I don’t think we should have cars like this as bad as people are. We’re still gonna be better drivers than AI. They cannot have reflexes like we have their sensors can fail just like ours can. But at the end of the day, if their sensors fail, it’s a wrap we can correct our sensor if we have enough time.
youtube 2026-04-19T00:1…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UgxjljbjwdDfMyDOKvZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugw4n0VViZHl_DXHue94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytc_UgxUqy48KEy5A2fH1nd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxlPfb5-0ufzAIp9Bp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugxw76-r-JM6mmuT-yR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"}, {"id":"ytc_UgywbL0e53XafB29e494AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}, {"id":"ytc_Ugyi_L1V0g164RsPvXF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgwVKXChKJ8cKWPrw3B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"ban","emotion":"mixed"}, {"id":"ytc_UgyjnEC8pGAd8t2e7OR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugw5tVMSzUQjZsOmdNB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"})
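Note that the raw response above ends with `)` instead of `]`, so it is not valid JSON. A minimal sketch of why this yields an all-"unclear" coding result, assuming the pipeline falls back to "unclear" on a parse failure (the fallback behavior and `code_comment` helper are hypothetical, inferred from the table above; the raw string is abbreviated to one record):

```python
import json

# Abbreviated stand-in for the raw response above: a JSON-like array
# that closes with ")" rather than "]", so strict parsing fails.
raw = (
    '[{"id":"ytc_UgxjljbjwdDfMyDOKvZ4AaABAg","responsibility":"user",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"})'
)

def code_comment(raw_response: str) -> dict:
    """Hypothetical fallback: code every dimension as 'unclear'
    when the model output cannot be parsed as JSON."""
    try:
        records = json.loads(raw_response)
        return {"parsed": True, "records": records}
    except json.JSONDecodeError:
        return {
            "parsed": False,
            "responsibility": "unclear",
            "reasoning": "unclear",
            "policy": "unclear",
            "emotion": "unclear",
        }

result = code_comment(raw)
print(result["parsed"])  # False: the stray ")" breaks json.loads
```

Under this assumption, a single malformed character in the model output is enough to discard the whole batch, which would explain why every dimension for this comment was recorded as "unclear" despite the model having emitted a plausible code for its id.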