Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We can all take action at our own scale. For example, you can talk about these issues with those around you, share videos and texts that resonate with you with your friends and family, and help raise awareness. You can sign the Statement on Superintelligence from the Future of Life Institute, which is easy to find online, that calls for a global ban on the development of superintelligence. You can support organizations like PauseAI or ControlAI, which advocate against the development of systems that surpass human intelligence. You can contact your elected officials and let them know that the development of artificial intelligence worries you and that you want this industry to be strictly regulated. As AI impacts our lives and society (with job losses, for example), the debate over regulating this technology is becoming increasingly important in the public arena, and we can all support, during elections and otherwise, politicians who advocate for the regulation of the industry. For example, one can support politicians who oppose the establishment of data centers in the territories they represent. In short, the more people worry and are aware of the dangers, the easier it will be to obstruct the industry and the uncontrolled development of this technology.
youtube AI Moral Status 2025-12-11T12:2…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          ban
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgzenVi1k1_kJsQan6d4AaABAg.AQ_KwSJC6KKAQ_cUTkjRBE","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzenVi1k1_kJsQan6d4AaABAg.AQ_KwSJC6KKAQar51L3Yy2","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_Ugx0xp3t361lvGcq99J4AaABAg.AQ_KWbBpsHwAQ_hc9k69kK","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugx0xp3t361lvGcq99J4AaABAg.AQ_KWbBpsHwAQaUSP-Awr2","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugx0xp3t361lvGcq99J4AaABAg.AQ_KWbBpsHwAQaY1hSrQ6p","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx0xp3t361lvGcq99J4AaABAg.AQ_KWbBpsHwAQg_oYkHVwF","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxRIebW4QX-Nl0Ou-d4AaABAg.AQ_HqH4_OP9AQaNUb20A_g","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgzAoYnNRBIymVpqHOV4AaABAg.AQ_HK1c4JDdAQ_IEwfKfp-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzAoYnNRBIymVpqHOV4AaABAg.AQ_HK1c4JDdAQ_INTZRKr1","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgzVCJzkZZemBlomJ6x4AaABAg.AQ_GaFb1rxVAQatsj6vtZ7","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
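A raw response like the one above can be parsed and screened before being stored as coding results. The sketch below is a minimal, hypothetical validator: the per-dimension value sets (`SCHEMA`) are inferred only from the sample records shown here, so the real coding scheme may allow additional categories, and the function name `parse_codings` is illustrative, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample output above; the full coding scheme may include more categories.
SCHEMA = {
    "responsibility": {"distributed", "ai_itself", "developer", "user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (JSON array of coded comments) and keep
    only records whose values fall inside the known schema."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Example with a single (hypothetical) record:
raw = ('[{"id":"ytr_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
print(parse_codings(raw)[0]["policy"])  # prints "ban"
```

Records carrying an unknown value are dropped here; a production version might instead flag them for manual review, since silently discarding codings biases the downstream counts.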