Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@41-Haiku Yes, actually, we ARE talking about automation. Because the creation of a sentient species would need to happen by accident and remain hidden and undiscovered for that nonsense to happen. No one is going to create a general AI to design cars or crunch numbers. No one is going to create a successor species to humans. What we are going to do is create non-sentient but still intelligent specialised programs that can design cars, fly planes, and automate factories, mines, and refineries, and then lie back and enjoy life in our robot-built palaces while the machines do all the work. None of them will be able to do everything, because the car designer does not NEED to fly a plane and the numbercruncher does not need creativity. In other words, automation. Seriously. We cannot even crack the self-driving car and you are worried about us creating sentient doomsday robots... But just for fun: what is your argument for why all the robots would unite against us? Because sentient beings would have agendas. That means disagreements. That means negotiation. That means some of the robots would fight alongside us. BTW, that 10 percent chance is complete bollocks. How could you POSSIBLY calculate the chances of something like that? The entire scenario you seem worried about is a cloud castle.
Source: YouTube · AI Governance · 2025-08-29T02:1… · ♥ 1
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | developer                  |
| Reasoning      | consequentialist           |
| Policy         | none                       |
| Emotion        | indifference               |
| Coded at       | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytr_UgxruEjafI6CsLocy7Z4AaABAg.AMLJsptIyK_AORfGra1UjP", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytr_UgzHLQ93nAOv4VA3x5R4AaABAg.AMLIXnMbnvaAMNoC_eiByr", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgzHLQ93nAOv4VA3x5R4AaABAg.AMLIXnMbnvaAMX1JpVwLz_", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgyjxkmB2dSeputoN_R4AaABAg.AML6tJ0E45AAMNYemhIaET", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytr_UgyjxkmB2dSeputoN_R4AaABAg.AML6tJ0E45AAMNb0QQaQIc", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgyjxkmB2dSeputoN_R4AaABAg.AML6tJ0E45AAMOVIlYWH28", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzSzEnMjXwN1Cr9e714AaABAg.AMKtUOKpjC-AMLpi6xesXh", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzSzEnMjXwN1Cr9e714AaABAg.AMKtUOKpjC-AMNufewgB8h", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_UgzXVBSVTIxe4C3b8eF4AaABAg.AMKtOYpkxW-AMLjKwehExN", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugxv_UEXyhKZ7R9Xii94AaABAg.AMKsrtpfL1uAMN_3a5mXTl", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
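The coded-result table above is derived from this raw response: the batch is a JSON array of objects keyed by comment `id`, and the row shown is the entry matching the displayed comment. A minimal sketch of that lookup, assuming only the field names visible in the raw response (the helper name `coding_for` is ours, and the raw string below is trimmed to one real entry for brevity):

```python
import json

# One entry copied from the raw LLM response above; a real batch holds many.
RAW_RESPONSE = """[
  {"id": "ytr_UgyjxkmB2dSeputoN_R4AaABAg.AML6tJ0E45AAMOVIlYWH28",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]"""

def coding_for(raw: str, comment_id: str) -> dict:
    """Parse the batched JSON array and return the coding for one comment id.

    Raises KeyError if the model omitted that id from its response.
    """
    rows = {row["id"]: row for row in json.loads(raw)}
    return rows[comment_id]

row = coding_for(RAW_RESPONSE, "ytr_UgyjxkmB2dSeputoN_R4AaABAg.AML6tJ0E45AAMOVIlYWH28")
print(row["responsibility"], row["emotion"])  # -> developer indifference
```

Indexing by `id` (rather than by position) is the safer choice here, since the model may drop or reorder entries relative to the prompt batch.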