Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Not true. They only ground planes when they think the problem is systemic. Planes are rarely grounded after one crash; usually it takes at least two very similar ones. And this has nothing to do with AI research anyway. AI companies accept risk because there's no other choice. They are in a race: if anyone slows down while others don't, they give up the potential reward without reducing the risks. The only way out is for everyone to slow down at the same time and cooperate strongly on safety. But that's practically impossible when the prize at the end of the race is not just a little profit, but enormous profit and unprecedented power. What you can do is win the race by such a margin that you have time to solve the alignment problem before others catch up. That was Elon's plan with OpenAI, and now with xAI.
youtube | AI Governance | 2025-08-30T13:5…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugx-qzznYwo1reEFsad4AaABAg.AMA8-Pk0B_PAMSIEx5jYF7", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMAN33w51l2", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCC2_IYVmC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCR6jqsD9G", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCWnuuATiY", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugz5jNs56uezLOZZwbV4AaABAg.AMA1IuTRIZAAMJNat_ztFI", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzOrZwZQyjwQTMXmBh4AaABAg.AMA1-19-Qz_AMGbw0V_JDN", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugwim9XQC9rU_cnMzhN4AaABAg.AM9y2mqUfvFAMB9yJeDbRx", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AMSKFocRrSv", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AN3QkA82btd", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
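A minimal sketch of how a raw response in this shape can be parsed and checked downstream. The five field names (id, responsibility, reasoning, policy, emotion) come from the response above; the sample records, the REQUIRED set, and the tallying step are illustrative, not part of the actual pipeline.

```python
import json
from collections import Counter

# Two illustrative records in the same schema as the raw response above.
raw = """[
  {"id": "ytr_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]"""

records = json.loads(raw)

# Basic schema check: every record must carry all five coding dimensions.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}
for rec in records:
    missing = REQUIRED - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id')!r} is missing {missing}")

# Tally one dimension, e.g. the emotion codes across all records.
emotion_counts = Counter(rec["emotion"] for rec in records)
print(dict(emotion_counts))
```

A check like this catches records where the model dropped a dimension before they silently become `none` values in the coded table.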