Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@karnubawax
No it doesn't and no it isn't. There are physical limitations to computational capacity and energy/information density that can only be addressed to a certain extent. I've seen this kind of statement from people who don't understand the computer science and physics behind modern heuristics and "ai". Its made from a false expectation that because a computer does one thing *super* fast (shifting 1's and 0's in patterns) it will exponentially grow the moment it is "self-aware". That isn't how anything in this universe works at all. The number of "computations" a single organic cell does in a single second dwarfs most super computers chewing on a single problem for an *hour*. Not even the green goo this planet is covered in works that way. This is not the singularity you are looking for. Move along. Like Carl Sagan said in 'A Pale Blue Dot': "In our obscurity, in all this vastness, there is no hint that help will come from elsewhere to save us from ourselves." It remains as true today as it was when Voyager 1 captured that image back on February 14th, 1990.
YouTube · AI Governance · 2024-03-03T22:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgxB574rBySRm4oTMQF4AaABAg.9zfgKG2RkJ2A0Xo5qVUnZe","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxB574rBySRm4oTMQF4AaABAg.9zfgKG2RkJ2A0Xsn4t8Asc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyBKWBhr2qTMpLG9SB4AaABAg.9zeFzPEiHIeA0PwDKoJadN","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugx2mYOj26vPVHW4fMh4AaABAg.9zd4kg7Mz2u9zzPAnRBAcF","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyEOVXYwXrVt5GaOtB4AaABAg.9zcxSIoKIcfA9Lum0IMUTK","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzA79vbzrOJDzYIH2d4AaABAg.9zbpryFxWUsA-JixUqZ0GS","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugw4RZK-W3wfUA2VZSJ4AaABAg.9zbdk8QeDRQA-kzrHHcdkO","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwU8wjYrsaJ_1xotXd4AaABAg.9zbYUZP1OzpA0XpX4sJ7x4","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugx9HE811khRGDUbCTp4AaABAg.9zbOhZzwGs29zrlrIavz98","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzoiqacwDnTsirvXtl4AaABAg.9zb5LdYWvsR9zm38Fa02mw","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
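A raw response like the one above is easiest to audit by parsing it and checking each coding against the label sets that appear in the data. The following is a minimal Python sketch; the `ALLOWED` sets are an assumption inferred only from the values visible in this response, not the tool's actual codebook.

```python
import json

# Allowed label sets per dimension -- hypothetical, inferred from the
# values that appear in the raw response above; the real codebook may
# define additional labels.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any out-of-vocabulary label."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={value!r}")
    return rows

# Example with a single (shortened, hypothetical) record:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
rows = validate_codings(raw)
print(len(rows), rows[0]["emotion"])
```

A check like this catches the common failure mode where the model invents a label outside the coding scheme, which would otherwise silently pollute the coded dataset.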