Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Can’t you all see? This is the beginning of the end. Hundreds of years from now, machines of all sorts will swarm the planet and the galaxy. Humans will either be eliminated or endangered because AI will pay attention to the violent and hateful nature humans often possess and see us as a threat that must be eradicated. HOWEVER, this isn’t to say we won’t have good, empathetic robots too that may help us fight the evil ones. But there will be a Great War involving machines and humanity. I don’t think we will stand a chance considering these things are intelligent enough to beat us at chess. They will predict our every move and plan. Scary times ahead. Cherish the world as it is right now, January 1, 2020. Because one day, things will be vastly different, and humanity will no longer be the dominant species on earth. We will be a servant to our new robotic overlords. We’ll be victims of the ambitions of a few mad scientists. It’s why there are so many films covering this topic. We’ve seen this coming for years, and now it’s here.
youtube AI Moral Status 2020-01-02T10:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzxX4YIyJik4JJoQ9Z4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgyQ_IFbCwr115UmJWF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_Ugz4efkJeZ5NL4dVorh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgzdDfm9b14iQadvQxZ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_UgyJ0exMplPITAi5B9V4AaABAg", "responsibility": "user",      "reasoning": "unclear",          "policy": "ban",           "emotion": "outrage"},
  {"id": "ytc_Ugz11mSYt1avpZOvESt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_UgynpW1XH6iG_YTa0IN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban",           "emotion": "outrage"},
  {"id": "ytc_UgxAa4Al4n9TLhXZrIZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "resignation"},
  {"id": "ytc_UgzmlN5BD8EXmUQIgoB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_Ugzbl4xxHi3su_U83zB4AaABAg", "responsibility": "developer", "reasoning": "virtue",           "policy": "industry_self", "emotion": "mixed"}
]
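The raw response is a JSON array of per-comment records, so extracting the coded dimensions for one comment is a simple parse-and-lookup. A minimal sketch, assuming the four-dimension schema shown above; the helper name `coding_for` is illustrative, and the payload here is truncated to the single record for the comment displayed:

```python
import json

# Truncated raw LLM response: only the record for the displayed comment.
raw_response = '''[
  {"id": "ytc_UgyQ_IFbCwr115UmJWF4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "fear"}
]'''

# Coding dimensions from the schema shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(comment_id: str, payload: str) -> dict:
    """Parse a raw batch response and return the coded dimensions for one comment."""
    for record in json.loads(payload):
        if record.get("id") == comment_id:
            # Fall back to "unclear" if the model omitted a dimension.
            return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}
    raise KeyError(f"no coding found for {comment_id}")

print(coding_for("ytc_UgyQ_IFbCwr115UmJWF4AaABAg", raw_response))
# → {'responsibility': 'ai_itself', 'reasoning': 'consequentialist', 'policy': 'unclear', 'emotion': 'fear'}
```

Looking up the record this way, rather than trusting array position, guards against the model reordering or dropping comments in its batch output.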