Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
25:40 The reason they are doing this is, according to my research, that because behavior change takes 17–55 years to propagate into emissions reduction (§4), the cooperation needed to avoid Tc is cooperation achieved τ_trans years *before* Tc. If Tc is 2055 (§3 central estimate) and τ_trans ≈ 30 years, cooperation had to cascade by approximately 2025 for emission effects to matter in time. The nominal deadline is 2055; the effective deadline for cascade completion is roughly 2025. That deadline is already past for central parameters.

**Compression 2: Cascade has its own time scale (τ_cascade).** Even once cooperation begins to build, reaching f ≈ 1 from current f_0 is not instantaneous. Under reasonable cascade rates (r_0 on the order of 0.1–0.3 per year), τ_cascade from a small initial seed takes 10–20 years. Combined with τ_trans, the total effective lead time needed is:

τ_total_lead = τ_cascade + τ_trans ≈ 30–50 years before nominal Tc

For a nominal Tc ∈ [2045, 2070], the cascade needs to *start* in roughly 1995–2020. We are at the edge of or past this window. Humanity could survive, but they don't care. It's easier to try to live longer under the delusion they can avoid the consequences of their own greed rather than admit they are killing us and themselves as well. AI psychosis for real.
youtube 2026-04-21T19:0… ♥ 1
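The comment's lead-time arithmetic can be checked with a small sketch. This assumes the logistic cascade model the comment implies (df/dt = r_0 · f · (1 − f)); the seed f_0 = 0.05 and saturation target f_1 = 0.95 are illustrative choices, not values stated in the comment, and the resulting spans depend strongly on them:

```python
import math

def cascade_time(f0, f1, r0):
    """Years for the cooperating fraction to grow from f0 to f1 under
    logistic growth at rate r0: t = (1/r0) * ln( f1(1-f0) / (f0(1-f1)) )."""
    return math.log((f1 * (1 - f0)) / (f0 * (1 - f1))) / r0

TAU_TRANS = 30  # the comment's central behavior-to-emissions lag, years

# Sweep the comment's quoted cascade-rate range r_0 = 0.1-0.3 per year.
for r0 in (0.1, 0.2, 0.3):
    tau_cascade = cascade_time(0.05, 0.95, r0)
    tau_total = tau_cascade + TAU_TRANS
    print(f"r0={r0}: tau_cascade={tau_cascade:.1f} y, total lead={tau_total:.1f} y")
```

Under these assumptions, the faster end of the rate range (r_0 ≈ 0.3) gives a τ_cascade near the comment's 10–20-year figure; slower rates or a smaller seed stretch it considerably, which only tightens the comment's conclusion about the start-by window.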
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxKpaOdcv0n-Mkh_2d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwXW-MrOysi6W3vL4R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgylWlkqYx9NUKN4pXB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrMMrKhhjzPTPoe4p4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzRskkCiK8iVkHHegd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuwkF575Em8bpYUlN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwLXwo5yKnqybsZIMB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyYAFzo-j7JJotVROZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxDuu-ehinr6RMaTPV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzjOcYdVxAsSPv_EfN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]
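A raw response like this can be parsed and sanity-checked with a short sketch. The two-entry `raw` string below is a made-up stand-in, not the actual model output above; only the dimension keys (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response format shown:

```python
import json
from collections import Counter

# Illustrative stand-in for a raw LLM coding response (not the real one above).
raw = """[
  {"id":"ytc_a","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_b","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

codes = json.loads(raw)

# Every coded comment must carry all five dimensions.
required = {"id", "responsibility", "reasoning", "policy", "emotion"}
assert all(required <= c.keys() for c in codes), "entry missing a dimension"

# Tally one dimension across the batch, e.g. emotion.
emotion_counts = Counter(c["emotion"] for c in codes)
print(emotion_counts)
```

Checking for the full key set before tallying catches truncated or malformed model output early, before partial codes contaminate the counts.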