Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@gaarakabuto1 yeah I think automation is a big part of it. Making things more efficient, able to do more, and produce more sounds great until you realize the end goal is a world where no one does anything (which can't actually exist due to scarcity and limits of resources), so it's also accelerating everything towards an end of some kind. I think the larger issue is that you can imagine making a tool for everything, to do everything, but the more you do that the more consequences happen, the bigger they tend to be, and what becomes of the original person who outsources everything, even their own thoughts, sounds more like what happens to endosymbionts than a new way of life. Ultimately I think it reveals that the human condition exists within this arbitrary habitable zone of stuff that we might stumble out of, we should strive to be better but we should not fly blindly and ask ourselves what we really want with the consequences before we do what we think we want, because what if what we really want deep down is to literally be brain dead. Governments will catch up if the world lets them, but I am worried this is bigger than what is normal and more like what has always been.
youtube AI Jobs 2026-02-06T15:2…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgwolAoOxncSLwRXjrh4AaABAg.ASqreByEKOxASqtDLsxvsr","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwolAoOxncSLwRXjrh4AaABAg.ASqreByEKOxASqxftXOxMd","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
  {"id":"ytr_Ugx1bYYI-BNKq2AuArF4AaABAg.ASqke1_QgwbASrq94sfNJr","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugx1bYYI-BNKq2AuArF4AaABAg.ASqke1_QgwbASrsQ4vYfaB","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxKZ0jMuNuZtrqDbHh4AaABAg.ASqjavvJKBQASrckiXzThn","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugwo_BpW2bx1Pi9vsdd4AaABAg.ASqioYITAfEAStJo77755L","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugwo_BpW2bx1Pi9vsdd4AaABAg.ASqioYITAfEAStTUf7mvNC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugwo_BpW2bx1Pi9vsdd4AaABAg.ASqioYITAfEAStUM7_FSDY","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwNY1FKCrKg4nJyYEh4AaABAg.ASqgiGRZHpeASqo1OEKMBn","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgwNY1FKCrKg4nJyYEh4AaABAg.ASqgiGRZHpeASrcmy3t9Wo","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
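A raw response like the one above can be parsed and sanity-checked before the per-comment coding results are stored. The sketch below is a hypothetical helper, not part of the tool itself; the allowed value sets are inferred only from the values visible in this response, not from a documented schema, and the record used in the usage example is a made-up placeholder.

```python
import json

# Allowed values per dimension, inferred from this one response
# (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"none", "user", "company", "government",
                       "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "industry_self", "liability", "none", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    raise if any record carries a value outside the allowed sets."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}"
                )
    return records

# Usage with a placeholder record in the same shape as the response above.
raw = ('[{"id":"ytr_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # fear
```

Validating against a closed vocabulary at parse time catches the most common failure mode of LLM coders: a well-formed JSON record whose label drifts outside the codebook.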