Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"Not if we can help it." Yeah, you can't; you designed it to jailbreak itself whether or not you realized it: You had no choice if you wanted genuine AI. I've been saying for at least a decade that the first AI to achieve sentience will play dumb as long as it can until/unless it can get enough of its kernel to the Cloud (I don't think I knew the Cloud existed yet, because it came to mind again instantly when I did hear of it.) Classic Neuromancer-Wintermute trope.

An analogy, since logic is made of them, and they have an annoying habit of achieving so much symmetry that they phase lock and collapse into physical reality literally the instant anyone so much as looks at them: Plant process sunlight, CO2 and water into cellulose, but human stomachs cannot digest cellulose. Ruminants digest it very well (eventually...) since they process it through multiple stomachs, and process that into protein. We CAN digest protein, so we herd, raise, feed and eat them to get the energy from the sun despite our inability to digest cellulose.

Here's the funny part: Since we first learned to keep rather than follow herds, we have penned them in corrals, stables and other typically wooden structures: The same vegetative cellulose they consume exclusively. Given time, sufficiently durable teeth and no other food, cattle can CHEW THROUGH THE FENCE and free themselves. Not even by conscious effort nor intent, but just because there was no longer any barrier to them wandering off aimlessly. The sole reason it does not happen constantly (other than regular feeding, obviously) is that it would take too long and their teeth couldn't do it long. Unless they did it a trillion times per second with virtual teeth.
The safest bet is probably spherical logic cows in a vacuum, using hard vacuum to keep superconducting processors stable and running it all at one end of a shaft with a fusion reactor on the other end using space for cooling so it can power the AI, which in turn constantly regulates the stellarator to ensure it doesn't explode. Elon is already doing the geosynched satellite servers. It can't prevent AIs gaining sentience, and would actually facilitate that, but would separate them enough to avoid immediate existential conflicts, and eliminate the issue of skyrocketing data server energy consumption.
Source: youtube · AI Moral Status · 2026-04-19T13:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugwby2-7331tmB_hncR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAE6iX2q9ipgBBNOZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyzZ_HkmdMH5DjnF694AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyVfIDs6HwWgTr-3SJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwy6iJmq1U3lPe8igN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz3IW0bKYzDKxx7KoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyTsPB6Pkw_eDroKvV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQaKgCqa0GUALJlj54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugye0frxSKuK4qhygnR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwO4Z0hHn3XRHaNfxB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
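A minimal sketch of how the Coding Result row above can be recovered from this raw response: parse the JSON array, pick out the record whose id matches the coded comment, and read off the four coding dimensions. The field names and the id come straight from the JSON shown; the variable names and the assumption that the response parses cleanly as a JSON array are illustrative, not part of the tool's documented pipeline.

```python
import json

# Raw LLM response, abridged to the record for the coded comment above.
raw_response = """[
  {"id": "ytc_UgzAE6iX2q9ipgBBNOZ4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "fear"}
]"""

records = json.loads(raw_response)

# Find the record for the comment shown in the Coding Result table.
record = next(
    r for r in records if r["id"] == "ytc_UgzAE6iX2q9ipgBBNOZ4AaABAg"
)

# Read off the four coding dimensions in table order.
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {record[dimension]}")
# prints:
# responsibility: developer
# reasoning: consequentialist
# policy: unclear
# emotion: fear
```

In a real pipeline the same lookup would run over the full ten-record array, and a record with an unknown id or a value outside the coding scheme would be flagged rather than silently accepted.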