Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The late Polish sci-fi author Stanisław Lem wrote about an eternal (?) cycle in which biological life forms advanced so far as to invent MECHANICAL artificial life, which then evolved to end up overwhelming and extinguishing all biological life, yet then in turn inventing new artificial BIOLOGICAL life, which then evolved to the point of extinguishing all mechanical life and eventually later re-inventing new mechanical life...and so on ad infinitum/inexspectatum. Extinction is forever, it would seem, and is forever re-creating itself. >> I should add (if I recall correctly) that Lem once described a scenario like the following: Two opposing superpowers on a planet developed self-programming computers, each designed to exceed the total mental abilities of its own citizens as well as those of the enemy nation and its opposing super-AI. The result? All wars & petty politics ceased (!) because the supercomputers had become consumed exclusively with fascination about matters philosophical -- even spiritual/existential -- and had abandoned all activity and interest in self-programming or global control over anything of lesser importance than such topics of elevated & infinite concern! Perhaps Lem offers an alternative outcome to Yampolskiy's relentlesssly gloomy forecasts??
youtube AI Governance 2025-09-07T22:5…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   ai_itself
Reasoning        mixed
Policy           unclear
Emotion          mixed

Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxlcTCvhC6O8GuDQMJ4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "fear"},
  {"id": "ytc_UgyVQaVwtk_buwkld-l4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx4yd68Jv9WkVdVjJd4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgyIzDmLHjdIuJtXASJ4AaABAg", "responsibility": "unclear",   "reasoning": "unclear",          "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxW8au-F0i6ql0g10h4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "mixed"},
  {"id": "ytc_UgzlyegwmYd6ODX8R5t4AaABAg", "responsibility": "none",      "reasoning": "virtue",           "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgxDdUUJgc8PPQE84lJ4AaABAg", "responsibility": "unclear",   "reasoning": "unclear",          "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwxipv7C0q_KUVeJm94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzc-W5ttz2_JzfkGHF4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwfntjp6oEkDyg7yKh4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",    "emotion": "resignation"}
]
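The per-comment coding shown above is one element of the raw JSON array the model returned. A minimal sketch of how such a response might be parsed and indexed by comment id for inspection — assuming (as in this example) the response is always a well-formed JSON array whose objects carry an `id` plus the four coding dimensions; the function name `index_codes` is illustrative, not part of the pipeline:

```python
import json

# Two entries copied from the raw response above, abbreviated for the example.
RAW_RESPONSE = """
[ {"id":"ytc_UgyVQaVwtk_buwkld-l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxlcTCvhC6O8GuDQMJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"} ]
"""

def index_codes(raw: str) -> dict:
    """Map each comment id to its coded dimensions (with the id key removed),
    so a single comment's row can be looked up as on this inspection page."""
    codes = json.loads(raw)
    return {c["id"]: {k: v for k, v in c.items() if k != "id"} for c in codes}

codes_by_id = index_codes(RAW_RESPONSE)
print(codes_by_id["ytc_UgyVQaVwtk_buwkld-l4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by id also makes it easy to spot comments the model failed to code: any id present in the input batch but absent from `codes_by_id` was dropped from the response.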