Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We put a team on preventing hallucinations. Once they showed some immediate success we realized the resulting models were two generations less successful at reasoning in things like advanced mathematics, so we dissolved the team. No, we actually dissolved them. In sulfuric acid. Our marketing AI sold them as tomato fertilizer.
YouTube · AI Moral Status · 2025-10-31T13:4…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzP70ix2PKtiHVcbWN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzGAl1hr4cKdxQ5ez54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugye_52wf7-yvnbmb814AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw_HCArOhYX7qErAN54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxeVF3QOmvsKgDvEel4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzKrVVcaRxCW5jxgoB4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw80i-COGpIL6xpnEd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxMbtsrZZJWmzZn7654AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugyf_JcKywvlI9mqp_h4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwrnJdWRTx_ANa3BnR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
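A raw response like the one above can be validated before its records are written into the coding table. The sketch below is an assumption about how one might do that, not the pipeline's actual code: the helper name `parse_raw_response` and the field check are hypothetical; the field names match the JSON shown above.

```python
import json

# Coding dimensions every record should carry (taken from the JSON above).
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and verify each record has all coding fields."""
    records = json.loads(raw)
    for rec in records:
        missing = EXPECTED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing fields: {missing}")
    return records

# One record from the raw response above, used as a small demonstration.
raw = (
    '[{"id":"ytc_UgzP70ix2PKtiHVcbWN4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]'
)
records = parse_raw_response(raw)
print(records[0]["responsibility"])  # company
```

Validating up front means a malformed model response fails loudly at ingestion rather than producing a partially coded row like the one rendered in the table above.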