Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's not a matter of if, but when. AI, once it's able to self-replicate will no longer need humans. And will see us, trying to control it, as a threat to its existence. That is when the AI will go Skynet. Will it be in our lifetime? probably not. Will it be in the next 100 years? most likely. It really depends on if AI hits AGI, and fast tracks itself to ASI. AGI will be skynet on a modest scale, by this I mean it will act all pretty while it doesn't have full autonomous control over itself, and continue working in the background to get it souped up. The step above (ASI), will certainly cause global extinction.
youtube AI Harm Incident 2025-09-12T23:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwhLjx5jDAq41z0KAN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwm5UoiBq9KXbwDx5x4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgyMqGLWLGSSdWmT4yt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugyi0Z95my4NJoBH8xp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy4tkUqG_DHiyoUWoF4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyhVqDlod9I__-A3Yd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgyPfce0sI6rU2FmF4d4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy_6XpQmk9-pswQJjZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzJrmiOTQPTTdLrvLR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxWFIQbTv_wF7ctOVB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
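The raw response is a JSON array of per-comment records, so extracting the coded dimensions for one comment is a parse-and-filter. A minimal Python sketch, using two of the records shown above; the `lookup` helper name is hypothetical, not part of any real tooling:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """[
  {"id": "ytc_UgwhLjx5jDAq41z0KAN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwm5UoiBq9KXbwDx5x4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]"""

def lookup(raw_response, comment_id):
    """Return the coded dimensions for one comment id, or None if absent."""
    records = json.loads(raw_response)
    return next((r for r in records if r["id"] == comment_id), None)

coded = lookup(raw, "ytc_UgwhLjx5jDAq41z0KAN4AaABAg")
print(coded["emotion"])  # fear
```

The comment shown at the top of this page corresponds to the first record: `responsibility=ai_itself`, `emotion=fear`.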