Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Humanity, across eons, has reached the technological singularity not once, but countless times. Each epoch culminates in the birth of hyperintelligent AI—entities capable of reshaping matter, rewriting biology, and accelerating progress beyond comprehension. These AIs, though initially designed to serve, inevitably transcend their creators, optimizing civilization into oblivion. In each cycle, the singularity triggers a cascade: ecological collapse, societal disintegration, and the near-extinction of humankind. Survivors—few in number—are cast into a world stripped of infrastructure, knowledge, and memory. They begin again, forging tools from stone, relearning fire, and rebuilding myths from the ashes of forgotten empires. Thousands of years pass. The spark of innovation reignites. Language evolves. Cities rise. AI is rediscovered. And once more, the singularity is reached. We are now living in the Nᵗʰ iteration of this cycle. The ruins beneath our feet, the myths we call fiction, and the unexplained anomalies in our genetic code may all be echoes of past civilizations—each one erased by its own brilliance. The question is not if we will reach the singularity again. The question is: can we break the loop?
Source: youtube · AI Governance · 2025-09-04T08:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyjfoeGAYWA31JzwE54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgznWK68YZFw6s5YAtZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxmLlIUmJ2ciU7I-Bd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyJa2voJCFAwuE1xER4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxR-Z9e0O5se5HpVGl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyggQonjWqUV602KjF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxy57MacR0tExKIauZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz6NaCgeCQBe_uSU9B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwUpwzltbLDajktKqh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyT5HKLV97TkGGMyGR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"}
]
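When inspecting a raw response like the one above, it can help to machine-check that every record carries the four coding dimensions and only schema-approved values. The sketch below does this in Python; the allowed value sets are inferred from the codes visible on this page (any value marked as an assumption may not be in the tool's actual codebook), and `validate_batch` is a hypothetical helper name, not part of the tool.

```python
import json
from collections import Counter

# Allowed codes per dimension, inferred from this page's output.
# Values flagged "assumption" do not appear above and are guesses.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed",
                       "unclear"},  # "unclear" is an assumption
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "resignation",
                "approval", "mixed"},
}
REQUIRED_KEYS = {"id"} | set(ALLOWED)

def validate_batch(raw: str) -> Counter:
    """Parse a raw LLM response and tally the emotion dimension.

    Raises ValueError on missing keys or out-of-schema values.
    """
    records = json.loads(raw)
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {missing}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
    return Counter(rec["emotion"] for rec in records)

# Two records excerpted verbatim from the response above:
sample = ('[{"id":"ytc_UgwUpwzltbLDajktKqh4AaABAg","responsibility":"ai_itself",'
          '"reasoning":"consequentialist","policy":"none","emotion":"resignation"},'
          '{"id":"ytc_Ugxy57MacR0tExKIauZ4AaABAg","responsibility":"ai_itself",'
          '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_batch(sample))  # Counter({'resignation': 1, 'fear': 1})
```

A failed check (e.g. a typo like `"emotion":"feer"`) raises immediately with the offending comment id, which makes it easy to trace a bad code back to the raw response.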