The Limits to Linear Thinking & Methods

By Herb Wiggins, M.D.; Clinical Neurosciences; Discoverer/Creator of the Comparison Process/CP Theory/Model; 14 Mar. 2014
.

Largely, the old ways of linear thinking are coming to an end. This can be demonstrated in many ways; the Procrustean myth of the old Greeks was about exactly this real, limiting problem. The false dichotomies, the "A --> B only" mentality of machines, where pushing a button yields only one outcome, contrast with complex systems (most of the universe of events, 99.99+%), which can do much with little. Linear logic and methods have reached their limits to growth. We can be clear about this because of the rule of diminishing returns, and because of their incompleteness and oversimplification. The simple logics and methods are becoming obsolete and are being replaced by far, far better methods which can do more with much less, and thus grow faster, and do what linear methods cannot even dream of. This change of epistemology and paradigm marks, in short, the end of the machine age.

Logic, and the forms of math built on those recursive logics, are not complete, as Gödel's proof, more accurately called Gödel's incompleteness theorem, shows. And it's easy to see how this works. If everything is either A or not-A, with no third possibility allowed (the excluded middle), as formal verbal logic demands, then this is refuted by the statement that events are either white or black. That is the fallacy of the false dichotomy, which misses the unlimited variations of shades of grey between the two. Thus our logic eats itself like the Ouroboros snake; it's intrinsically not logical. Or, using another approach, there is the koan of Ulam, who said that calling the universe "non-linear" is like calling biology the study of all non-elephants: it misses millions of species. Those are the reasons why linear methods are incomplete, and why we are reaching the end of the machine age. Linearizing is simplifying, but too much so. It's to some extent an easier way of looking at things, but it misses too much to be very complete, and that's the problem. Light speed measurements, which contrast greatly with QM interpretations of the same events, and which also elide over and cover up this fallacy, will be addressed below.
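
As a quick, toy illustration of the grey-scale point: a strict two-valued test throws away the continuum, while a graded value keeps it. The 0.0-to-1.0 brightness scale and the function names here are assumed, purely for the example.

```python
def is_white_binary(brightness: float) -> bool:
    """Excluded-middle view: anything not fully white is treated as black."""
    return brightness >= 1.0

def grey_level(brightness: float) -> float:
    """Graded view: keep the whole 0.0 (black) to 1.0 (white) continuum."""
    return max(0.0, min(1.0, brightness))

for b in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"brightness {b}: binary says white={is_white_binary(b)}, graded grey={grey_level(b)}")
```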

Another example is that of graphite reactors, which behave as complex systems at low power: small areas of fluctuation propagate from chain reactions that begin at some sites more than others and create instabilities, which can blow up, compared with the more sustained, more even, effectively linear fissioning at higher power outputs.

The same widely varying rates of fissioning at low power are unstable because some areas grow rapidly by input/output mechanisms. As a few neutrons hit a few atoms, those fission, releasing more neutrons and creating the quickly rising, well-known chain reaction. At the onset of fissioning this takes place at random areas, with more fissioning at some sites and less at others. It arises from input/output effects: by chance, chain reactions grow exponentially in some places and not so much in others, owing to variations in purity and the interactions of fissioning centers, until those centers coalesce into more even, largely homogeneous fissioning areas at higher power. The instabilities come at shutdown and at powering up. This is what destroyed Chernobyl's reactor block 4, contributing to the fall of the USSR and to the dreadful radiation contamination problems which continue widely over central Europe today. It was a complex system being ignored by linear thinking which likely created the problem.
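
A toy branching-process sketch can show why low power is the unstable regime. The numbers are assumed and this is not a reactor model: each neutron in a generation is taken to produce either 0 or 2 new fissions, averaging 1, and the only difference between the two runs is how many neutrons are present to begin with.

```python
import random

def run_generations(start_neutrons: int, generations: int = 8) -> list[int]:
    """Crude branching process: each neutron yields 0 or 2 fissions, averaging 1."""
    counts = [start_neutrons]
    n = start_neutrons
    for _ in range(generations):
        n = sum(random.choices([0, 2], k=n))   # 50/50 choice per neutron
        counts.append(n)
    return counts

random.seed(1)
print("low power (20 neutrons)     :", run_generations(20))
print("high power (20,000 neutrons):", run_generations(20_000))
# The small population swings proportionally far more, and can even die out or
# run away; the large one stays close to its average and looks smooth, "linear".
```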

Very likely, the first nuclear pile, created by Fermi and Leo Szilard (the latter of whom first realized the possibility of the nuclear chain reaction and patented it for himself), was a similar kind of graphite-moderated reactor to those used at Windscale, with its disastrous events, and at Chernobyl, the RBMK-1000 models. Had Fermi and his team been less lucky, they could have blown up several city blocks in Chicago and badly damaged or destroyed much of the University of Chicago, and even that great city. But nothing is gained by not trying out new methods, although the later test at Alamogordo, carried out way out in the middle of vast empty country, showed they had learned their lesson, at least in part. Those are ways complex systems can be applied, as well.

.

As another example of linear thinking ignoring complex-system effects and events, look at a weather vane, at turbulent flow, and at plane crashes, to see the aspects of complex systems being missed by the forced, linear, Procrustean-bed approach of eliminating most of what's going on. It simplifies, for sure, but it still yields wildly and seriously incomplete methods and models. A weather vane generally shows wind direction. But let's take a more complete, complex-system look at what's much more likely going on. Compare the direction of the wind shown by a heavier weather vane versus a much lighter one. (Wind socks are even more linearizing!) The heavier weather vane uses the inertia of its greater mass to average out all of the other directions in the turbulent flow of the wind. So do wind tunnels, for that matter. Both are highly linearized methods which ignore what's much more likely going on.

.
For instance, the light weather vane on a breezy day will point generally in the prevailing direction, but it varies its direction a LOT more, owing to the real, turbulent flow of the wind currents. It can move, literally, in any direction, as does the turbulent flow of wind, which creates vortices, eddies and other features developing out of normal wind flow, so that the lighter vane can even point in the opposite direction at times compared to the heavier one. A light vane can even spin about, whereas the heavier vane is not so likely to do that. Thus the heavy weather vane is linearizing wind events and creating a false output which, while it simplifies interpretation, is not in fact the case and is thus not a very complete description of wind flow.
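.
A small numerical sketch of that contrast, under a crude assumption: the wind direction jitters turbulently around a mean, and a vane's inertia acts like a smoothing filter on what it reports. The "inertia" parameter is a stand-in for vane mass, purely illustrative.

```python
import random

def vane_readings(wind_directions: list[float], inertia: float) -> list[int]:
    """A heavy vane (inertia near 1.0) responds slowly; a light vane (near 0.0) tracks every gust."""
    reading = wind_directions[0]
    readings = []
    for w in wind_directions:
        reading = inertia * reading + (1.0 - inertia) * w
        readings.append(round(reading))
    return readings

random.seed(0)
wind = [180 + random.gauss(0, 40) for _ in range(10)]   # turbulent gusts around 180 degrees
print("light vane:", vane_readings(wind, inertia=0.1))  # swings widely, as the real flow does
print("heavy vane:", vane_readings(wind, inertia=0.9))  # changes only slowly: a smoothed, linearized report
```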
.
Further, turbulent flow is growth-capable, which is why a butterfly's wing flapping can create, via growth processes, outcomes ranging from tropical storms to typhoons. The same growth capabilities of complex systems create the transforms related to emergence in complex living systems as well. Turbulent flow is created by this constant adding up and proliferation of events.
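.
The growth of tiny differences can be shown with a standard toy model, the logistic map, standing in here for the butterfly-wing image; two starting values differing by one part in a million end up completely different after a few dozen steps.

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map in its chaotic regime (r = 4)."""
    return r * x * (1.0 - x)

a, b = 0.400000, 0.400001          # nearly identical starting conditions
for step in range(1, 41):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a = {a:.6f}   b = {b:.6f}   difference = {abs(a - b):.6f}")
```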
.
This omission of real data, this ignoring of existing events, has serious implications for aircraft flight. As mentioned above, wind tunnels give a highly linearized kind of flow, which is NOT what is seen in real-life turbulent flow, i.e., winds. Thus the efficiencies of such wings are seen just about ONLY at higher velocities, rather than the lower. As fixed-wing planes fly slowly, the wind flow over the wings is necessarily more turbulent; as they speed up, it becomes more linearized. This creates a problem at lower speeds, too, because the fixed wings are designed for high velocity. But, and this is the key point, slowing down to land, and speeding up from a standstill to lift-off and beyond, are NOT as linear. That is largely what creates the most crashes, which are very often seen at take-off and landing. These observations should have very profound effects upon wing and aircraft design as well. They are also why it is good to fly into a headwind when landing and taking off: not just because the headwind slows the craft's ground speed, but because it rapidly increases the linearity of the airflow and diminishes the turbulence, too. It's not just a linear phenomenon, we see. With VTOLs, this was an early problem in acceleration and take-off as well.
.
This is important because it is at lift-off and take-off, as well as at landing, that MOST crashes take place; that is, during the non-linear wind flows around those fixed, linear wings, where downdrafts and updrafts, cross-currents, and the other normal phenomena of turbulent, chaotic wind flow, easily and provably observed, take place. Thus many plane crashes are the result of wings and lift which work best in linear flows and tend to fail at other times. Even at higher speeds, hitting the jet stream, or the cross-current flow of two intersecting weather systems, can do much the same thing: the so-called "clear air turbulence".
.
And there is that turbulent flow problem again, and this is the real point. Birds do not use fixed wings; they fly with moving, complex systems, by flapping and by changing their wing and body shapes and movements in many different ways. As those are complex systems, they have much more capability to adjust to cross-currents and turbulent flows, adjusting automatically in ways which trial-and-error systems have found to be effective. They can even create and perform the most amazing flight acrobatics as well. Our vector-thrust aircraft, still being developed, can do a lot of this too. Thus vector thrust (VT) is a more non-linear way of flying, because of the unlimited combinations of flows that two thrust-vectoring engines can create.
.
These VT aircraft create non-linear, complex-system flight paths which linear jets cannot begin to match, nor compete against. The 90% and 99% kill rates of the vector-thrust Indian MiGs and Sukhois against advanced US F-16 fighters in mock air battles showed this back in 2004-5. Those results have been carefully hushed up and not widely broadcast, because they showed, among other things, that complex-system approaches were far, far more capable than the linear, fixed-wing methods so long used. Similarly, thrust-vectoring missiles and torpedoes have such capabilities, because they are not subject to the linear flying and sailing which their targets must be, and they can outfox anti-missile systems which expect inertial, linear flight paths.
.
But how can we deal with turbulent flow? That has been the problem in solving and understanding turbulence. It is rather simple, in fact, and goes to the heart of how we know what we know, that is, our epistemological assumptions. Our cortices can solve those problems by a simple series of methods using comparison processes. They find the stabilities in complex systems, those repeating events which can be seen again and again, recognized, remembered and recalled without limit. The birds have made those observations over some 220 million years of gliding and flight, and have passed them into the instinctual, complex system of flight skills created by their nervous systems. Those skills modify wing shape and movements to minimize the interruption of flight every 5-10 ms or so, which linear-system, fixed-wing aircraft cannot possibly do. That is the secret. Our complex-system brain outputs ignore linearity and look for stabilities (most often created by least-energy effects) and patterns which they can recognize, and within those they create a working, practical, trial-and-error-driven model of efficient flight. Surely it takes longer to develop and use, but birds do not crash nearly as often as our aircraft do, and they are at home in flight through turbulent conditions, because rather than trying to linearize turbulence, they take advantage of all of its unlimited options to fly with greater safety, skill and ability than fixed-wing, mere machines can manage.
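.
A very rough sketch of the difference between a fixed wing and a wing that corrects itself every few milliseconds, under assumed numbers: gusts arrive as random disturbances, the "bird" applies a simple proportional correction at each step, and the "fixed wing" never adjusts. The gain and gust sizes are illustrative only.

```python
import random

def worst_deviation(corrects: bool, gain: float = 0.8, steps: int = 200) -> float:
    """Track the largest departure from the desired attitude over many gusts."""
    error, worst = 0.0, 0.0
    for _ in range(steps):
        error += random.gauss(0, 1.0)     # a turbulent gust
        if corrects:
            error -= gain * error         # rapid corrective adjustment at every time step
        worst = max(worst, abs(error))
    return worst

random.seed(2)
print("fixed wing, worst deviation     :", round(worst_deviation(corrects=False), 1))
print("adjusting wing, worst deviation :", round(worst_deviation(corrects=True), 1))
```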
.
This is what a consideration of weather vanes can show. Once we realize those facts and find the appropriate answers and solutions, the crashes at take-off and landing due to turbulent flow will largely disappear. These are the practical benefits of complex-system thinking replacing highly limited, incomplete and not very capable linear thinking.
.
Weather is also a complex system of multiple factors: wind speeds and directions, the humidity, density and temperature of the air, and their complex interactions. As a result, the huge variety of weather of all types cannot be understood very well using linear methods. Thus we have found that, as with the simpler complex systems of dice, and the more complex systems of quantum mechanics, we must use probabilities, which find more of the stabilities and repeating events that lead to understanding. Once again, it is our ability to input the outputs of events, to find the higher-order relationships, which creates our hierarchies of understanding of weather: the seasonal temperature changes; the greater rain in spring, where cooler air meeting warmer, more humid air creates rain, thunderstorms, even hail, lightning and tornadoes, with their attendant dangers, which weather forecasting can develop and use to predict more cleanly what's going on. Thus we can better guard ourselves against flooding, lightning strikes, powerful storms, wind and hail damage, and so forth. Probabilities are one way, found by trial and error, in which we can better deal with bad weather and good, with heat and with severe cold as well. They introduce more predictability, using mathematical probabilities, than is possible with linear methods, building on the far, far older and still workable comparison processing of recognition and then pattern recognition. Indeed, most math models are developed from the earlier recognition and pattern-recognition models, as has been shown with relativity, which was first created by recognition systems and THEN mathematized after the fact by Minkowski and Einstein. This is where our sensory information has been converted into mathematical forms, although those are highly linear methods which also miss a lot of the complex-system details.
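.
The dice case shows in miniature what probabilistic forecasting leans on: no single throw is predictable, but the distribution of outcomes is a stability which repeats. A small, illustrative simulation makes the point.

```python
import random
from collections import Counter

random.seed(3)
totals = Counter(random.randint(1, 6) + random.randint(1, 6) for _ in range(100_000))
for total in range(2, 13):
    print(f"total {total:2d}: frequency {totals[total] / 100_000:.3f}")
# Individual throws are unpredictable, yet 7 reliably turns up about 1/6 of the
# time and 2 about 1/36: a repeating stability we can plan around.
```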
.
With respect to the linearization of complex systems in which medications/drugs are used, the same kind of epistemological shift away from linear thinking appears in drug "side effects", which are in fact complex-system effects. Calling them side effects simply ignores, oversimplifies, and misses the outcomes needed to increase understanding without limit. I have discussed this before in:
.
Peruse about 1/2 way down the article to the paragraph beginning:
“Lets use a clear, medical application of complex systems thinking….. “
.
But suffice it to say, the linear method of "take the pill" and ignore the "complex-system effects" is best replaced by complex-systems thinking, a far, far better way of doing things. It is leading to a new pharmacology, where multiply active drugs have far, far more beneficial effects than just one linearized pill, and with fewer side effects. More likely, such a pharmacology widely uses those "side effects", which can be our friends. As current research shows, bacterial drug resistance is being attacked using just those methods. Understanding the deeper ambience, the myriad milieus of complex systems, allows us to give a dual drug, one which blocks the beta-lactamase used by resistant bacteria against the highly effective class of penicillins and cephalosporins (for example, a penicillin paired with a beta-lactamase inhibitor such as clavulanate).
.
Thus the penicillin (PCN) starts to work again. And there are other mechanisms which can be used, such as the triple therapies long used against TB and Staph infections, among the more common and earlier forms of drug-resistant bacteria. Using three antibiotics together, in different combinations, creates a kind of complexity of interactions which the bacteria will have a very hard time figuring out and adjusting to. The trinary method creates, at N = 3 or greater, a combinatorial complexity of such vastness that it is all but impossible to overcome. We use complexity against bacterial resistance, in other words; we make the combinatorial complexity too great for the bacteria to overcome easily. Thus complex-systems thinking extends our models well past the linear "take the pill with the fewest side effects" (ahem!) approaches.
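.
A rough back-of-the-envelope illustration of why the combinations are so hard to defeat, using assumed numbers (the mutation rate and population size below are hypothetical, chosen only for scale): if resistance to each drug requires its own independent mutation, the chances multiply.

```python
per_drug_mutation_rate = 1e-8    # assumed chance of a resistance mutation arising per cell division
population = 1e9                 # assumed number of bacteria in an infection

for n_drugs in (1, 2, 3):
    p_cell_resists_all = per_drug_mutation_rate ** n_drugs
    expected_resistant_cells = p_cell_resists_all * population
    print(f"{n_drugs} drug(s): expected pre-existing fully resistant cells ~ {expected_resistant_cells:.0e}")
# One drug: resistant mutants are already expected somewhere in the population.
# Three drugs at once: the expected number collapses to effectively zero.
```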
.
This is also how sildenafil can be made to last three days, and how drug resistances of almost all types can be attacked and blocked. By using three or more medications which attack HIV, much more is attained as well. These are complex-system approaches created by trial and error, and sadly those using them have no good model for understanding WHAT they are doing, and why, and how to expand greatly on this complex-system knowledge to create unlimited amounts of further useful approaches and work. This is the power of complex-system understanding and knowledge, and why it can go and grow without limit, too.
.
Largely, though, what we create is what we are looking at. We mistake the "precision" of our figures for something which does not actually exist. We linearize the outputs by least squares, and then admire the accuracies we achieve. This is false reasoning. We artificially create a line from the scattered points by least-squares methods; by measuring enormous numbers of events we create a very tight, supposedly highly accurate figure, which disappears when we measure only a few of those points. We sum up endless measurements and get a very tight figure for "cee", but if we measure only a few photons, we find values spread both below and above that light speed. Gleick discussed much of this in his book "Chaos". The scatter of points so often found in experiments, then linearized by a least-squares line drawn through all those points, is omitting, ignoring and missing actual events.
Extending this further, this is the light speed problem in a nutshell. When 10^30 photons are measured, a very, very tight, linear speed of light is obtained. But this is an illusion. When ONLY 10-20 photons are measured, the velocities will likely fall both above and below; there will be a distribution of photon velocities, NOT every photon travelling at exactly cee. This is the source of the illusion which linear methods create. When Eugene Wigner, in an article published in 1955, showed that helium nuclei tunneling out of a radioisotope did not ALL stay below light speed, but that some were transluminal, his findings were ignored, and physicists refused to come to grips with the Procrustean-bed linearizing of events in existence which they were doing.
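.
A toy Monte Carlo shows the purely statistical side of this argument: when each individual measurement scatters around a central value, the mean of an enormous sample looks razor-sharp, while a sample of 10-20 shows values spread on both sides. The Gaussian scatter and its width are assumed here for illustration only, not measured photon data.

```python
import random
import statistics

C = 299_792_458.0                 # metres per second, the defined value of cee
random.seed(4)

def measurements(n: int, scatter: float = 50.0) -> list[float]:
    """n individual speed measurements with an assumed +/- scatter around C."""
    return [random.gauss(C, scatter) for _ in range(n)]

big, small = measurements(1_000_000), measurements(15)
print("mean of 1,000,000 measurements:", round(statistics.mean(big), 3))
print("mean of 15 measurements       :", round(statistics.mean(small), 3))
print("the 15 raw values range from  :", round(min(small), 1), "to", round(max(small), 1))
```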
.
The summation is not real; the certainty and precision are created by our methods, which hide the awful truth that at the quantum level of events such precision is NOT possible. This has been shown before: when we measure events of specific types again and again, with more and more precise methods which cost more and more to use, we eventually reach the limits of our precision, yet we still fail to find that last digit. The law of diminishing returns kicks in, as it has in particle physics in trying to reach the end of the exponential barrier of light speed, and we miss what is going on. We are looking at the forced results of our methods, at the limits of our methods, not necessarily at events in existence. We create the Procrustean bed of linearity, forcing events of all sorts into a straight line, when in fact those events are not likely so.
.
Let's carry this further. Measure a single length with a series of measuring methods, as has been stated before many times. At first we eyeball the length and get an approximation, say 8 inches. Then we use a good tape measure or rule and get the length to, say, 8 1/4 in. Then we use a very fine metal draftsman's rule and get it to about 8 9/32. Then a finer metal rule gives 8 17/64. Then we use a micrometer and get 8.25773. But at each level, no matter whether we use a light microscope, an electron microscope, or light interference to measure down to nanometers, we still have, in each case of measuring, a +/- error of measurement, and each method costs more and more. We are approaching the rule of diminishing returns, the exponential barrier that limits almost all measurements. There is NO final digit to be found; there is always a limit to our measurement. Events in existence are not likely to be measurable to any final digit. Further, with changing temperatures and pressures, the physical characteristics of objects will, at microscopic levels, change shape and size as well. Most measured events are very likely irrational numbers, intrinsically, given these limits in our measuring methods.
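.
The measuring ladder above can be written out with the usual +/- half-least-count uncertainty attached to each tool. The relative costs are invented placeholders; the point is only that each extra digit costs more while the uncertainty never reaches zero.

```python
measurements = [
    # (tool, reading in inches, +/- uncertainty, invented relative cost)
    ("eyeball",          8.0,       0.5,      1),
    ("tape measure",     8.25,      1 / 32,   5),
    ("draftsman's rule", 8.28125,   1 / 64,   20),
    ("finer metal rule", 8.265625,  1 / 128,  100),
    ("micrometer",       8.25773,   0.0005,   500),
]
for tool, value, uncertainty, cost in measurements:
    print(f"{tool:17s} {value:.6f} +/- {uncertainty:.6f} inches   (relative cost {cost})")
# Each step buys a smaller uncertainty at a larger cost, and no step ever
# produces a final digit.
```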
.
And this is the point: linearizing events in existence imposes a Procrustean bed of unreality upon them. Colors are NOT just a visible, linear spectrum of wavelengths and energies (E = hν). Walk into a paint store and look at the huge collections of color palettes, which mix all those wavelengths into combinations of colors NOT seen in the EM spectrum, nor even hinted at there. Add the unlimited grey scale between black and white, and together these create the unlimited vastness of colors detectable by our eyes. That is the reality of what colors are much more likely to be: mixtures of spectra of light, photons in varying numbers at many frequencies, combined with everything from white to black and all the unlimited shades of grey, fixed by the quantum emission events of electron levels and, in some cases, nuclear gamma radiation, too.
.
This is the problem. The photons making up colors are much, much more than a linear scale of light frequencies. They are in fact combinations of photons of many frequencies, which we can sometimes see and detect and often cannot. Our eyes see at a macroscopic level, which sums up and misses almost all of this unimaginable detail, and indeed creates synthetic colors such as brown, mixes red with blue and gets violet, and mixes blue and green and gets aqua and the various shades of turquoise, too. Colors are constructs of our brains. We miss most of the details, in terms of the numbers of photons of light at each frequency, and in fact we cannot begin to measure that complexity, either. And there is NO final digit to brightness, nor to the mixtures of frequencies, either. Light as we see it is NOT a linear scale; it is saturation, numbers of photons, and the kinds and intensities of photons, you see. And none of those have ANY final digits, either. Events are essentially irrational numbers at a deep quantum level.
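.
Brown is a convenient example of a constructed color: it appears nowhere on the spectral rainbow, and in the RGB terms a screen uses, it is simply a dim orange. A short sketch makes the point; the particular numbers are just a typical choice.

```python
orange = (255, 128, 0)                                   # a bright, spectral-looking orange
brown = tuple(int(0.4 * channel) for channel in orange)  # the same hue at lower intensity
print("orange:", orange)
print("brown :", brown)   # about (102, 51, 0), which displays as an ordinary brown
# The "new" color comes from context and intensity, not from any new wavelength.
```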
.
But understanding this, we can see beyond the highly simplified, linear series of low and high frequencies of light, just as we can see beyond linear number lines to the more complex events which lie behind temperatures, which are essentially simplified down to the speeds of particles and atoms. Near absolute zero, which we cannot EVER reach, characteristics change: Bose-Einstein effects take place which are NOT predicted by classical physics. Near light speed, characteristics change, too. In either case our reality sits within a well, between an exponential barrier at cee on one side and the same at absolute zero on the other. Whence come these events? There is something beyond absolute zero as well, which our linear methods do not account for, either.
.
But cee can be exceeded, because measurement of photons, and of the quantum tunneling of alpha particles (helium nuclei) out of radioisotopes, shows FTL events, which feed into the average of light speed, do they not? We are summing up not ONLY from below light speed, but from above it, to create the statistically exact light speed. Cee is a tight figure because we use so many orders of magnitude of repeated measurements to make it that way. But on either side of cee there are still photons moving both below and above it. Accordingly, in QM, cee is a statistic, NOT an absolute. Thus we get "beyond the absolutes"! Our models are not complete!
.
The same is true of pressure measurements. At normal STP we see solids, liquids and gases, all placed on linear scales of pressure and temperature. But the behaviour is complex, temperature- and pressure-dependent, and NOT linear. At very high temperatures we see plasmas, not gases. And at very high pressures we get a hierarchy, which is quantized: solids, then compressed solids, including unusual compounds not seen at Earth-surface pressures, then white dwarf star matter, then neutron stars, and finally black holes. The latter two likely evaporate, owing to quantum fluctuations in which a neutron decays to a proton, an electron and an antineutrino, plus energy; and black holes evaporate because some particles and photons, created in pairs, escape from the hole when they exceed cee. Compare and contrast this with the evaporation of radioisotopes, which also quantum-tunnel particles out of the highly compact nucleus, where high densities slow down processes such as neutron decay. Within many isotopes, neutrons are not limited to about 15 minutes of life but have, as far as we are concerned, an unlimited lifespan, as they do within a high gravitational field.
.
Yet another limit lies in the way our cortical columns (CCs) process information. We read, count and speak linearly, because that is the way the CCs work; linearity is built into the system. But by using hierarchically ordered means we can figure out complex systems, by looking at stabilities and linking those together by comparison processing (CP). This creates and writes the hierarchies of our understanding, and is also how we read and speak linearly. Yet it gets around linearity by allowing us to understand the patterns, relationships and associations of those groups of events, by thinking about them visually and not just by using linear speech and thinking. These visualization and hierarchical methods, using both taxonomies and hierarchies to order information, work a lot better than simple induction and deduction. Thus our recognition systems, driven by the higher logic of CP, get around the linear to create understanding of complex systems by this means.
.
In addition, we can parallel-process in our minds. It's built into the system, because we know that some persons can multitask; it's a skill, though not a common one. We know that Julius Caesar could dictate to one scribe while riding, at the same time thinking about what he was going to dictate to another, and so composed at twice the rate. And because the two streams of composition could interact with each other, this allowed a deeper analysis of events to take place. The two dictations were interacting, and the whole became more than the two apart.
.
In the same way, a chemistry professor could not only lecture to us but also, at much the same time, plan what he was going to do in the lab that afternoon, and he could integrate the two when needed. As our CCs DO operate rather independently of each other, although many are connected in real time, this too is a way of rising above the limits of linear thinking and data processing. Indeed, we think in real space while walking down the street; we can listen to music while writing; and we can see, hear and talk much at the same time, too. This vast integration, which may very possibly be creating a complex system of outputs, may in fact be creating the non-linear, complex system which we call consciousness. This, then, might be a series of solutions to the problem of rising above the limits of linear thinking and forward-flowing information processing in the CCs, and it sounds a good deal like what the many brain functions going on inside our heads are doing all the time, too.
.
We can be aware of our breathing and control it, though it falls back to brain-stem, subconscious control much of the time. For instance, we learn subconsciously to control our breathing while speaking, and we must: it is essential to talking normally. Talking means we are moving air through our vocal cords to make normal speech; thus much is being done, along with lip, tongue and palate movements. These combined activities, multitasking in other words, which we also perform while playing with ten fingers on a keyboard, show the multitasking capabilities of brains. Not only that, but the keyboard player can simultaneously listen to the orchestra around him while playing, as do the other members of the orchestra, band, or choir. All of these events imply innate, multitasking, simultaneous activities, which can produce much, much more by working together than alone, and they show that the normal brain is likely operating as a complex system, and NOT linearly. Thus we must learn to understand how these multiple tasks are done together, then extend this in our normal brain activities, then learn to facilitate it and teach it to others. This represents yet another way in which we can use visual thinking to extend our understanding of complex systems, too: the unlimited options given us by the normal brain, operating well outside the forced, linear methods we can grow far, far too accustomed to.