Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@dereckjames2725  Most importantly, the shoddy argument is: "SciShow quoted a statement that shows many scientists consider AI extinction risk to be a societal priority, and didn't quote some other statement (on Red Lines)." He reads a *lot* into this, implying ControlAI disagrees and saying you need to "pick a side" between the two -- when again, one of the sides is a random other statement that the video simply didn't say anything about. He also omits the fact that **the founder of ControlAI also signed the Red Lines statement**. The comparison is especially egregious considering the statement he's quoting was out for barely a month before the release of the SciShow video, so it probably wasn't public at the time the script was being written. And some incorrect facts (less important): he initially said that no AI had earned gold on the IMO, a claim he's since fixed. He also said Hank was wrong about "nobel prize winners" (plural) signing the extinction statement, but indeed two did (Geoffrey Hinton and Demis Hassabis). But I don't think he found *any* facts that SciShow got wrong, while getting multiple wrong himself. It was a rhetorically effective but extremely shoddy video.
youtube 2025-11-28T02:3… ♥ 2
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgzUuIJ6EbRP3LZ5OyV4AaABAg.A-cAHSJ3bSTA3wMOt8OC4c", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxMip-znYB5PR0C8tB4AaABAg.9zf5d0iXGr7A3wLSOa3Ekt", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugz3uAY5bKrr9M91PB14AaABAg.AOx7tpL0o1jAOx9UAB9IdM", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_Ugz3uAY5bKrr9M91PB14AaABAg.AOx7tpL0o1jAOxASiF1km9", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwTX3T7YLLZBvQirSt4AaABAg.AQRoBsJyR1sAQRogZlQ_Wh", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgwTX3T7YLLZBvQirSt4AaABAg.AQRoBsJyR1sAQRpL1t_Zsp", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwTX3T7YLLZBvQirSt4AaABAg.AQRoBsJyR1sAQRq6fsCtfu", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzfWTIyxZVxT9PkyjF4AaABAg.AQ-cxAnlJSvAQ2rIWn_BWe", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxzghU6kDYBi48vupZ4AaABAg.AQ-VI_FM3UqAQ1AdGtlSLF", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzaE7MuLfH0jW1P_A14AaABAg.AQ-O8GvqQTXAQaX99l_9bF", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "disapproval"}
]
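The raw response is a JSON array of per-comment coding records, keyed by comment id, with the five dimension fields shown in the table above. A minimal sketch of looking up one comment's codes from such a response (the `raw_response` string here is a single-record excerpt of the array above; variable names are illustrative, not part of the tool):

```python
import json

# Excerpt of the raw LLM response: an array of coding records.
# Field names ("id", "responsibility", "reasoning", "policy", "emotion")
# are taken from the response shown above.
raw_response = """[
  {"id": "ytr_UgzfWTIyxZVxT9PkyjF4AaABAg.AQ-cxAnlJSvAQ2rIWn_BWe",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"}
]"""

records = json.loads(raw_response)

# Index the records by comment id for direct lookup.
by_id = {record["id"]: record for record in records}

codes = by_id["ytr_UgzfWTIyxZVxT9PkyjF4AaABAg.AQ-cxAnlJSvAQ2rIWn_BWe"]
print(codes["emotion"])    # outrage
print(codes["reasoning"])  # deontological
```

For the comment shown on this page, the record recovered this way matches the "Coding Result" table (deontological reasoning, outrage).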