Sunday, 9 June 2019

Murray Gell-Mann on Complexity (2)



i) Introduction
ii) Additional Information
iii) Simplicity and Complexity
iv) Complexity ≠ (Strong) Emergence
v) The Autonomy of Higher Levels?
vi) Higher-level Laws

[The short biographical introduction which opens this piece is a copy-and-paste from Part (1) – 'Murray Gell-Mann on Reductionism'.]

Murray Gell-Mann died on the 24th of May, 2019.

In 1964 Gell-Mann postulated the existence of quarks. (The name was coined by Gell-Mann himself, and it's a reference to the novel Finnegans Wake, by James Joyce.) Quarks, antiquarks and gluons were seen to be the underlying elementary constituents of neutrons and protons (as well as of other hadrons). Gell-Mann was then awarded the Nobel Prize in Physics in 1969 for his contributions and discoveries concerning the classification of elementary particles and their interactions.

More relevantly to this piece: in 1984 Gell-Mann was one of several co-founders of the Santa Fe Institute - a research institute in New Mexico whose job is to study complex systems and to advance the cause of interdisciplinary work on complexity theory.

Gell-Mann wrote a popular science book about physics and complexity science, The Quark and the Jaguar: Adventures in the Simple and the Complex, in 1994. Many of the quotes in this piece come from that book.

**************************************

The following words of Lee Smolin (an American theoretical physicist) sum up both Murray Gell-Mann's work and the man himself. (At least as they are relevant to this piece.) Firstly he explains Gell-Mann's work:

[P]hysics needs a new direction, and the direction should have something to do with the study of complex systems rather than with the kind of physics [Murray Gell-Mann] did most of his life.”

Then Smolin continues with a few words on Gell-Mann himself:

The fact that after spending a life focused on studying the most elementary things in nature Murray can turn around and say that now what's important is the study of complex systems is a great inspiration, and also a great tribute to him.”

Of course all the above is hardly a philosophical or scientific account of the need to move from the “elementary” to the “complex”. However, it does hint at the importance of gaining a broader picture of nature (or the universe). And that's what both Smolin himself and Gell-Mann realised. (In Smolin's own case, he supplemented his theoretical physics with cosmology and philosophy.)

Despite that, surely it can't be said that “what's important is the study of complex systems”. That's simply to reverse the “reductionist hierarchy”. Complex systems are simply part of the picture: not the most important part of it. Indeed it seems a little naïve to replace that previous (ostensible) hierarchy with a new one.

(Lee Smolin's take on Gell-Mann isn't surprising when one bears in mind the fact that he advances the philosophical position called relationalism.)

Murray Gell-Mann himself did appear to offer us a middle way between (strong) reductionism and the complete autonomy of the individual (“special”) sciences.

Gell-Mann believed that it's all about what he called the “staircases” between the sciences. As Gell-Mann put it (in the specific case of the relation between the levels of psychology and biology):

Many people believe, as I do, that when staircases are constructed between psychology and biology, the best strategy is to work from the top down as well as from the bottom up.”

What's more:

Where work does proceed on both biology and psychology and on building staircases from both ends, the emphasis at the biological end is on the brain (as well as the rest of the nervous system, the endocrine system, etc.), while at the psychological end the emphasis is on the mind—that is, the phenomenological manifestations of what the brain and related organs are doing. Each staircase is a brain-mind bridge.”

Interestingly enough, a man who has often been accused of “reductionism” (the American biologist and naturalist E.O. Wilson) expressed a similar view in the following:

Major science always deals with reduction and resynthesis of complex systems, across two or three levels of complexity at a step. For example, from quantum physics to the principles of atomic physics, thence reagent chemistry, macromolecular chemistry, molecular biology, and so on – comprising, in general, complexity and reduction, and reduction to resynthesis of complexity, in repeated sweeps.”

So instead of Gell-Mann's simplicity and complexity, in this case we have the “reduction” and “resynthesis” of complex systems in “repeated sweeps”.

In addition, the philosopher Patricia Churchland (who classes herself as a “reductionist”) also advances a position which is similar to Gell-Mann's. In her case, she confronts the neuroscience-versus-psychology debate. And, in so doing, she mollifies psychologists about that scareword “reductionism” by saying that the

reductionist research strategy does not mean that there is something disreputable, unscientific or otherwise unsavoury about high-level descriptions or capacities per se”.

The words above can be summed up in this way:

i) Simply because a scientist (or philosopher) says that x can be reduced to y (not necessarily without remainder),
ii) it certainly doesn't follow that this scientist (or philosopher) believes that x is (to use Churchland's words) “disreputable, unscientific or otherwise unsavoury”.

Then Patricia Churchland goes on to say something that may surprise some philosophers. She argues that reductionism can exist side-by-side with what she calls “high-level descriptions or capacities”. This too perfectly expresses Gell-Mann's own position (as we'll see).

To return to E.O. Wilson, he was also well aware that the “very word 'reductionism'” has a “sterile and invasive ring, like a scalpel or catheter”. He went on to say that the

[c]ritics of science sometimes portray reductionism as an obsessional disorder, declining towards a terminal stage one writer recently dubbed 'reductive megalomania'”.

Additional Information

A seemingly extreme reductionist position is actually put by Gell-Mann himself (in relation to the domain of biochemistry). He writes:

The proponents of this view are saying in effect that going from the fundamental laws to the laws of biochemistry involves almost no new information, and thus contributes very little effective complexity.”

Here the case for complete reduction is expressed in terms of information content. Or, as Gell-Mann puts it, “going from the fundamental laws to the laws of biochemistry involves almost no new information”. Nonetheless, here we also have “complexity” alongside possible reduction. That is,

a computer might have to do a great deal of calculating to derive the near-uniqueness of biochemistry as a theoretical proposition from the fundamental laws of physics”.

That biochemical complexity is compounded by the fact that it “also depends in an important way on history”. (Gell-Mann mentions “additional information” and “history” many times.) For example:

The laws of biology do depend on the laws of physics and chemistry, but they also depend on a vast amount of additional information about how those accidents turned out.”

However, that seems like a problem of complete knowledge rather than an argument against reduction... Unless, of course, a lack of complete knowledge rules out reduction. But even then a reduction would still be possible... “in principle”. After all, that additional information may be entirely peripheral or irrelevant. (In the sense that if an investigating officer were asking about the killing of a person, telling him about the colour of the neon signs above the dead body wouldn't help him.)

So “accidents” and “additional information” have always been known about (or accepted) by physicists – even by “reductionists”. It's just that they factored out their relevance. Were they right to do so? (Lee Smolin – with his “physics in a box” – and the philosopher Nancy Cartwright question all of this.) In any case, would any physicist or scientist ever have denied that they were factoring out additional or extraneous information? Of course they knew that such information existed. The problem is that taking on board everything in every experimental (or scientific) situation is impossible – and even complexity theorists and “holists” must accept this. (That's unless they hold a position of Absolute Idealism like F.H. Bradley's; or one of extreme holism.)

Simplicity and Complexity

Gell-Mann gave two interesting examples of the opposite of complexity. Firstly, he wrote:

[If] the environment in question is the center of the sun, at a temperature of tens of millions of degrees, there is almost total randomness, nearly maximal algorithmic information content, and no room for effective complexity or great depth—nothing like life can exist.”

It's interesting that Gell-Mann lumps simplicity and “randomness” together at “the center of the sun”. By “nearly maximal algorithmic information content” I take Gell-Mann to mean that, in order to fully account for that information content, the content would simply need to be replicated in its entirety. Gell-Mann himself puts this case elsewhere. He talks about “bit strings” and says that “it can be shown mathematically that most bit strings of a given length are incompressible”. In more detail:

In other words, the shortest program that will produce one of those strings (and then have the computer stop) is one that says PRINT followed by the string itself.... It is called a 'random' string precisely because it contains no regularity that will permit it to be compressed.”

In any case, Gell-Mann then jumps to the other extreme:

Nor can there be such a thing as life if the environment is a perfect crystal at a temperature of absolute zero, with almost no algorithmic information content and again no room for much effective complexity or great depth.”

Here, by contrast, it's the near-total absence of “algorithmic information content” – perfect order rather than randomness – that rules out complexity.
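Gell-Mann's point about the two extremes can be made concrete with a small experiment. The sketch below is only a rough illustration (true algorithmic information content – Kolmogorov complexity – is uncomputable, so ordinary zlib compression is used here as a crude proxy): a perfectly regular bit string, analogous to the crystal at absolute zero, compresses to almost nothing, while a pseudo-random string, analogous to the sun's core, can't be squeezed much below its raw information content. On Gell-Mann's account, neither extreme leaves any room for effective complexity.

```python
import random
import zlib

# Crude proxy for algorithmic information content: the length of the
# zlib-compressed version of a bit string. (Kolmogorov complexity itself
# is uncomputable, so this is only an illustration, not a measurement.)
def compressed_size(bits):
    return len(zlib.compress(bits.encode("ascii"), 9))

n = 10_000

# "Perfect crystal": total regularity, almost no algorithmic information.
ordered = "0" * n

# "Centre of the sun": near-total randomness, nearly incompressible.
random.seed(0)
disordered = "".join(random.choice("01") for _ in range(n))

print(compressed_size(ordered))     # a few dozen bytes: pure regularity
print(compressed_size(disordered))  # over a thousand bytes: a random bit
                                    # string can't go below roughly n/8 bytes
```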

Complexity ≠ (Strong) Emergence

What stops the reduction of the whole of chemistry (for example) to physics can be summed up with one single word: complexity. This is how Gell-Mann puts it:

In practice, even with the aid of the largest and fastest computers available today, only the simplest chemical problems are amenable to actual calculation from basic physical theory. The number of such amenable problems is growing, but most situations in chemistry are still described using concepts and formulae at the level of chemistry rather than that of physics.”

These concessions don't rule out reduction per se. That is, there's no strong emergence being hinted at here. All that's being admitted is that some chemical phenomena are so complex that it would be impossible to get all the information required to reduce a given chemical x to a physical y. That's not to say that the chemical x doesn't reduce to the physical y. It's simply to say that the reduction hasn't been done... Indeed perhaps it can't be done. But even here there's still no strong emergence. The only thing that's accepted is complexity.

And because of that complexity, Gell-Mann goes on to say that

[i]n general, scientists are accustomed to developing theories that describe observational results in a particular field without deriving them from the theories of a more fundamental field”.

The very fact that observation is being stressed (elsewhere Gell-Mann also uses the word “phenomenological”) shows that the micro level is being automatically ruled out (as it were). It's also odd (bearing in mind traditional biases in physics - right up to the birth of quantum mechanics) that observation is being stressed at all.

In any case, here again reduction hasn't been rejected in principle, and that's because

[s]uch a derivation, though possible in principle when the additional special information is supplied, is at any given time difficult or impossible in practice for most cases”.

So reduction is trumped by complexity and/or by the contingencies of scientific “practice”.

Gell-Mann then gives us a specific example of complexity trumping (strong) reduction. He writes:

[C]hemists are concerned with different kinds of chemical bonds between atoms (including the bond between the two hydrogen atoms in a hydrogen molecule). In the course of their experience, they have developed numerous practical ideas about chemical bonds that enable them to predict the behavior of chemical reactions.”

Despite all that, physicists and “theoretical chemists” are still hanging around on the sidelines. That is, “theoretical chemists endeavor to derive those ideas, as much as they can, from approximations to QED”. However,

[i]n all but the simplest cases they are only partially successful, but they don't doubt that in principle, given sufficiently powerful tools for calculation, they could succeed with high accuracy”.

Gell-Mann goes into more detail in the following:

In very simple cases, an approximation to QED is used to predict directly the results at the chemical level. In most cases, however, laws are developed at the upper level (chemistry) to explain and predict phenomena at that level, and attempts are then made to derive those laws, as much as possible, from the lower level (QED). Science is pursued at both levels and in addition efforts are made to construct staircases (or bridges) between them.”

Here again complexity trumps (full/strong) reduction.

It's also worth stressing “causal” dependency here, rather than reduction. That is, one can stress causal dependency without demanding any kind of complete reduction.

In other words, x can physically entail y (where x can be a set of conditions, properties, etc.), and yet y may still not be entirely accounted for by x.

The Autonomy of Higher Levels?

Gell-Mann also puts the case for what E.O. Wilson has called “consilience” (which doesn't rule out either reduction or reductionism) in the following words:

One lesson to be learned from all this is that, while the various sciences do occupy different levels, they form part of a single connected structure. The unity of that structure is cemented by the relations among the parts. A science at a given level encompasses the laws of a less fundamental science at a level above.”

If we have “a single connected structure”, then it's difficult to see how we can also have autonomy when it comes to the special sciences or to higher-level descriptions/laws. (Isn't this why the philosopher Jerry Fodor advanced what he called “strong autonomy”?)

There also seems to be a commitment to at least some kind of reductionism here. How else can we interpret the following words? -

[A] science at a given level encompasses the laws of a less fundamental science at a level above.”

Though it depends on how we interpret the words “reductionism” and “encompasses”. Nonetheless, Gell-Mann's words can be seen to be in favour of specific reductions; though not in favour of the philosophical standpoint of reductionism itself.

And then complexity does indeed raise its head:

But the latter [e.g. chemistry], being more special, requires further information in addition to the laws of the former.”

Here we need to know what the words “further information” mean because, clearly, that further information may not block (or rule out) reduction or even reductionism itself.

Yet despite Gell-Mann's acceptance of the autonomy of different scientific disciplines, it may seem strange that he should also argue (of psychology) that it is “not yet sufficiently scientific”. What's more, he argues that his

preference would be to take [them] up in order to participate in the form of making them more scientific”.

The words above can be read in two ways:

i) The “special sciences” aren't autonomous.
ii) In order to make the special sciences autonomous, we would need to “make them more scientific”.

Of course I'll now need to explain why I'm using the word “autonomy” here.

I do so because, for example, the theoretical physicist Sean Carroll often stresses the autonomy of the special sciences and higher-level descriptions. (The philosopher Jerry Fodor also stressed what he called “strong autonomy”.) Indeed Carroll advances the “autonomy” of what he calls “emergent theories”. (This is a vital part of his “poetic naturalism”.) Carroll writes:

The emergent theory is autonomous... it works by itself, without reference to other theories...”

Elsewhere, Carroll says that with strong emergence “all stories are autonomous, even incompatible”. Yet, in other places, Carroll also stresses emergent theories and their compatibility with fundamental theories. Indeed Carroll hints at a lack of complete autonomy when he says that “we might learn a little bit about higher levels by studying lower ones”.

Carroll also emphasises the “mapping” of a fundamental theory onto an emergent theory in a process called “coarse-graining”. So how can we have mapping as well as autonomy?
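To make the idea of coarse-graining slightly more concrete, here is a minimal sketch (my own toy example, not Carroll's): each micro-state of a small spin system is mapped onto a single macro-variable - the net magnetisation - and that mapping is many-to-one, which is precisely what allows an emergent description to throw away micro-level detail.

```python
from itertools import product
from collections import Counter

# Toy illustration of coarse-graining (an assumed example, not Carroll's own):
# a micro-state is a tuple of four spins, each +1 or -1; the coarse-grained
# (macro) description keeps only the net magnetisation.

def magnetisation(microstate):
    """The macro-variable: the sum of the individual spins."""
    return sum(microstate)

N = 4  # four spins -> 2**4 = 16 micro-states

# Count how many micro-states get lumped into each macro-state.
macro_counts = Counter(magnetisation(s) for s in product((+1, -1), repeat=N))

print(macro_counts)
# e.g. Counter({0: 6, 2: 4, -2: 4, 4: 1, -4: 1}) - a many-to-one mapping:
# the macro-level discards micro-level detail, which is why it has its own
# regularities rather than a one-to-one translation of the micro-level.
```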

In a seminar, Carroll also used the word “consistence” in reference to the fit between emergent and more basic theories. How can that consistency - between two very different autonomous theories - be established? Carroll also assumes compatibility (clearly related to consistency) between emergent theories and more fundamental (basic) theories. In addition, Carroll says that (some) emergent theories are accurate... Who says so? Does Carroll simply assume an accuracy that's tacitly and essentially guaranteed by a more fundamental (or basic) theory - thus limiting the emergent theory's supposed autonomy?

In opposition to Sean Carroll, it seems that Gell-Mann didn't believe in this (complete) autonomy. That's because he believed in both a “bottom-up method of building staircases between disciplines” and a “top-down approach” as well. Yet if the higher-level disciplines were autonomous, then why would they require either a “top-down” or a “bottom-up” method? Surely they could stand on their own feet. Indeed the fact that Gell-Mann even raises the question of both bottom-up and top-down approaches (or methods) means that he did indeed have a “bias in what invites the charge of 'reductionist'”. In other words, because Gell-Mann didn't believe in the (complete) autonomy of the special sciences, he could be classed as a “reductionist” - as he says himself. A non-reductionist, on the other hand, would say that the special sciences are autonomous and don't need to account for themselves (at least not via lower-level disciplines).

Higher-level Laws

Despite Gell-Mann's “reductionist” inclinations, he still believed that there are new scientific laws at higher levels. He wrote:

At each level there are laws to be discovered, important in their own right. The enterprise of science involves investigating those laws at all levels, while also working, from the top down and from the bottom up, to build staircases between them.”

And what Gell-Mann said of chemistry, he also said of biology. That is,

like chemistry, biology is very much worth studying on its own terms and at its own level, even as the work of staircase construction goes on.”

Gell-Mann also cited psychology and even the social sciences and history. In full, he wrote:

[I]t's essential to study biology at its own level, and likewise psychology, the social sciences, history, and so forth, because at each level you identify appropriate laws that apply at that level.”

However, Gell-Mann did qualify all this by basically saying that higher-level laws are dependent on lower-level laws - “plus a lot of additional information”! Yet that dependency doesn't in and of itself mean that completely new laws don't exist at higher levels. Nonetheless, all that additional information may not be lawlike – at least not as yet.

Gell-Mann continues by talking about “staircases” again. However, at a prima facie level, none of his talk about staircases rules out reduction; or, more specifically, it doesn't rule out the reduction of higher-level laws to lower-level laws. So it may be a little surprising that Gell-Mann finishes off by saying that

[a]ll of these ideas belong to what I call the doctrine of 'emergence'”.

Here all that can be said is that Gell-Mann is stressing weak (rather than strong) emergence. And, according to the philosopher Mark A. Bedau, “the notion of weak emergence is metaphysically benign”. Strong emergence, on the other hand, certainly is not.



