Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Godlike intelligence is mathematically impossible. About 9 years ago I developed a concept of belief possibility. Rather than beliefs just varying arbitrarily (everyone having their own truth), in some domains pretty much everyone agrees: reality exists, 2 + 2 = 4 (assuming the same units, no vectors, just the ordinary understanding of it). Then there are things like politics and religion where people wildly disagree. In the middle are open but difficult-to-solve questions dealt with by science.

What I argued is that belief possibility is objective: the number of belief options available is what determines how much people agree or disagree on a subject. More belief options = more disagreement. 2 + 2 = 4 has only one answer, but there are hundreds or thousands of religions.

Information theory has a similar concept: channel capacity in the presence of noise. Channel capacity defines the maximum rate at which information can be reliably transmitted over a communication channel, given noise that introduces uncertainty. When the noise level is too high relative to the signal, it becomes increasingly difficult to decode the transmitted information accurately, akin to the difficulty of settling on a definite answer amid a vast array of belief possibilities.

This has important implications for the development of "Godlike intelligence" or "Superintelligence": any AI we build is going to be subject to the same objective constraints that we are. It will face noise and the exploding complexity of belief possibility. This doesn't mean AIs won't be smart enough to outsmart us, only that Godlike intelligence is not possible.
YouTube · AI Governance · 2024-02-25T10:0…
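The comment above invokes channel capacity from information theory. As a rough illustration (an editorial sketch, not part of the original comment), the Shannon–Hartley theorem gives the capacity of a noisy channel as C = B · log2(1 + S/N): as the signal-to-noise ratio falls, the rate at which information can be reliably decoded shrinks.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second.

    bandwidth_hz: channel bandwidth B in Hz.
    snr_linear:   signal-to-noise ratio S/N as a linear ratio (not dB).
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# The same 1 kHz channel: a clean signal carries far more than a noisy one.
clean = shannon_capacity(1000, 100.0)  # high SNR
noisy = shannon_capacity(1000, 0.1)    # noise dominates
print(clean, noisy)
```

The numbers here are arbitrary; the point is only the qualitative behavior the comment appeals to, where rising noise steadily erodes how much can be settled reliably.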
Coding Result
Responsibility: unclear
Reasoning: unclear
Policy: unclear
Emotion: indifference
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxgyOLEKZIs8R0nVuB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxrhupK7bSuaZ0e_K14AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyagV03ul_3Eo8E0fR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyqAs5YSsSgsOvfugF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwwlHGnEMRnDh2xnGl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz-wJU7kLRNA8klytp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz4AbaMiGc6sx33Mwx4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwdFR1pfVwbEyZzCIZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxAiRpfRelYxQfmcR54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwzZAgX48Box5Uepih4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
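The raw response above is a JSON array of per-comment coding records. A minimal sketch of how such a batch could be parsed and tallied (using the first two records from the response above; the field names match the dump, but this loader is an editorial illustration, not the pipeline's actual code):

```python
import json
from collections import Counter

# First two records copied from the raw LLM response above.
raw = '''[
  {"id": "ytc_UgxgyOLEKZIs8R0nVuB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxrhupK7bSuaZ0e_K14AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]'''

# Parse the batch and count how often each emotion label was assigned.
records = json.loads(raw)
emotions = Counter(r["emotion"] for r in records)
print(emotions)  # Counter({'indifference': 1, 'fear': 1})
```

This assumes the model returned valid JSON; a production loader would also need to handle malformed output and unknown label values.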