Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't understand how an LLM is able to provide this answer, when there were so many reference points from 1950 onward for why sodium bromide is bad. Considering most of these companies are currently in the midst of lawsuits for pirating infinite information to train their models, it seems unlikely that an LLM model would suggest such a dangerous substitute. But also maybe I'm ignorant.
YouTube · AI Harm Incident · 2026-03-28T13:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzV-F_60Sfovnz-dkV4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugz5jD0Xf0wkDn7Cnd14AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgwOJ9wBH40WIZAJBUB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgwS09-Iq74TKVoWUQp4AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugw0tG5aByWkkzR_zdV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugx-m7z6oZR4lXUHD494AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgyqfTWUSg1XY1uC1nh4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzMwffmn05kpK8F21h4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgxXVYi2jMP-WueoI9R4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgyThoe_cT5Kc-Ov8uV4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",      "emotion": "approval"}
]
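A raw batch response like the one above can be checked before it is ingested into the coded dataset. Below is a minimal validation sketch; the allowed-value sets are assumptions inferred only from the codes visible in this section (the full codebook may define more values), and `validate_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. These sets are ASSUMPTIONS
# inferred from the codes visible in this section's raw response;
# the actual codebook may permit additional values.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "approval", "indifference", "fear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every record
    against the schema, raising on missing ids or unknown values."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim!r} value {value!r}")
    return records

# Usage with a one-record example in the same shape as the raw response:
raw = ('[{"id": "ytc_example", "responsibility": "company", '
       '"reasoning": "consequentialist", "policy": "regulate", '
       '"emotion": "outrage"}]')
records = validate_response(raw)
print(len(records), records[0]["policy"])  # 1 regulate
```

Validating eagerly like this surfaces malformed or out-of-schema LLM output at load time rather than letting it silently skew downstream counts.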