The S-Curves of Growth: A “Vade Mecum” of Complex Systems
.
By Herb Wiggins, M.D.; Clinical Neurosciences; Discoverer/Creator of the Comparison Process/CP Theory/Model; 14 Mar. 2014.
Copyright © September 2019
.
“Almost anything which jogs us out of our current abstractions, is a good thing.” –Alfred Whitehead
.
“Any society, (or group in a society) which cannot Break Out of its current abstractions, after a limited period of growth, is doomed to stagnation.” (verbal description of the S-curve)  –Whitehead
.
“I hold it that a little rebellion now and then is a good thing.”  –Pres. Thos. Jefferson
.
“QM is not wrong, but it IS Incomplete.”  –Albert Einstein.
.
“Calling the universe non-linear is like calling biology the study of all Non-elephants.”  —Stan Ulam, the founder of complex system studies.
.
“Mathematics must greatly advance before it can describe complex systems. ”  –Ulam
.
“Most every method/tool has its definable Capabilities & Limits” & “It’s a good craftsman who knows his tools.”
.
“The World is NOT my Idea!!”  The fallacy of idealisms refuted.
.
 “The Word is Not the Thing.” –Alfred Korzybski, founder of General Semantics
.
The differences between professionals and amateurs are largely & neatly summed up by CP of the efficiencies of their work. Those efficiencies drive the growth. And figuring out which methods create the best growth, in all of its aspects, gives a good outcome. & often we cannot calculate the outcomes, but we DO test them. & that can give us the info we need to decide which methods are the best in most cases.
.
The brain is uniquely designed & developed, and can be trained up to do exactly that, thus solving Cx Sys questions where maths, logics & other related methods cannot go or grow.
.
From the simple comment above by Whitehead, growth and then stagnation, we can see the verbal origins of the practical, empirical application of the “S-curves” to understanding complex systems (Cx Sys; acronyms & contractions are linguistic, LE devices!) in much more detail. In fact, S-curves very likely nearly universally describe & apply to the growth which can take place in such new, efficient, growth-creating emergent inventions, ideas, devices, tools, methods, skill sets, techniques & technologies. The whole rich panoply of Cx Sys is seen with the understanding, analogy & methods, techniques & technologies of the Kategoria of Aristoteles, plus the Synonymics & the larger related phrase groups, too. And those form the bases of our richer hierarchies of organized understandings, in most all fields.
.
This details the Complex System (Cx Sys) S-curves of growth and where those occur, virtually without limits in our universe of events. And this is, in part, how to mathematize them, as well.
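To mathematize such growth, the standard candidate, offered here only as a hedged sketch in Python (the parameter values are illustrative assumptions, not measurements), is the logistic function: slow take-off, exponential middle, flexion point, then the asymptotic limit to growth.

```python
import math

def logistic(t, K=1.0, r=1.0, t0=0.0):
    """A logistic S-curve.
    K  = the asymptotic limit to growth (the top of the S-curve),
    r  = the growth rate (the least energy driven efficiency gain),
    t0 = the flexion (inflection) point of diminishing returns."""
    return K / (1.0 + math.exp(-r * (t - t0)))

half = logistic(0.0)    # at the flexion point, growth is half done: K/2
early = logistic(-6.0)  # near 0: the slow take-off
late = logistic(6.0)    # near K: stagnation against the limit
```

Past t0 the curve still rises, but each period adds less than the one before, which is the “diminishing returns” read off the growth histories below.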
.
Addendum: The S-curve application to Moore’s Law
.
One of the most important and totally unrecognized S-curves is Moore’s Law for silicon computer chip growth. This empirical finding/event is the linchpin in showing the huge value of the S-curves seen in Cx Sys.
.
For many years the doubling time of each new generation of Si chips, in speed and decreasing size, was 2 years. But that is constrained by the S-curve, for TWO huge reasons. First of all, any innovation has its capabilities AND its limits, and that’s what creates the S-curve. The new efficiencies, least energy driven, create the exponential growth seen in Si chip technology. But then the limits begin to hit, and at the mathematical flexion (inflection) point the diminishing returns begin. That was first manifested by the increasing cost of each new chip, which was moving rapidly towards an asymptote at the top of the S-curve. The next major limit to growth was that as the transistors got smaller & smaller, the quantum leakage at about 4-8 nm. size would become so serious that the new chips would not function. There are others, such as increased heating and so forth, but both of those have hit. S-curve limits struck Moore’s Law very clearly right at the Fab 40 time, and the new chips came out very much later than expected, thus signaling clearly that the S-curve of Si chip growth was active. & the increased costs & times to manufacture, upwards of $billions in costs and years of work, had come. Moore’s Law failed, & that HAS been clearly shown in the last few years. AND it was the S-curve of growth in action.
.
Thus proving once again Whitehead’s deep insight: “That society (or group) which cannot break OUT of its current abstractions (methods, techniques, technologies) after a limited period of growth (the expon. growth of Si chips), is doomed to stagnation.” And that has now occurred. The S-curve with a vengeance, very, very clearly seen and very much ignored, too. Whitehead’s insight from 100 years ago still applies today, because it’s a nearly universal characteristic, as well.
.
Thus the power of the S-curves to predict the limits to growth (Herman Kahn) once more arose, and was entirely predictable by plotting the growth of the chips’ sales and use versus time. And we are NOW at the top of the Si chip S-curve, & it’s flattening out against the NEXT exponential barrier, as clearly seen many years ago in my models.
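Plotting growth versus time, the flexion point can be read off mechanically: the period-over-period gains rise, peak, and then shrink. A minimal sketch, my own construction with illustrative numbers, not real chip data:

```python
def flexion_index(series):
    """Index of the largest period-over-period gain: the flexion point,
    after which returns diminish even though the totals still rise."""
    gains = [b - a for a, b in zip(series, series[1:])]
    return max(range(len(gains)), key=lambda i: gains[i])

# An idealized capability-vs-time series shaped like an S-curve:
s = [1, 2, 4, 8, 16, 28, 36, 40, 42, 43]
peak = flexion_index(s)  # the largest jump, 16 -> 28, falls at index 4
```

Once the gains start shrinking, the curve has passed its flexion point; everything after is the approach to the asymptote.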
.
Thus, where from here? And the answers are many. Josephson circuits (JC) are the next possible solution, much, much easier than quantum computers. And if we want to sort out the complexities of the QC, we MUST have a JC computer to do that for us. Then we can make the QC many millions of times faster with less work, as well as explore the nearly unlimited capabilities and speeds of the JC computer, too. When that JC reaches its S-curve limits, presaged at the flexion point of diminishing returns, then it marks the time to double down and create the QC a lot faster. The same HTSC’s we can use for the JC WILL work on the QC, as well!!!
.
Trying to get ahead too fast is the problem. “Can’t run before we can safely walk” is the epigram here. The temps of the superconductors are the problem. HTSC’s work and can be used to make the JC computer. But, and this is the case, what is the limit to SC temps, if any? That key detail and model hasn’t been clearly discovered nor found out. But it’s easily amenable to combinatorial chemistry methods and work, too. Again, we use empirical outcomes work (& Structure/Function relationships) to create our models of how events work. As has also been addressed in my work, too, regarding how to eliminate most all antibiotic resistances by the microbes for the foreseeable future. Which key insight and connections have ALSO been missed by the old boys.
.
And so the expon. bars hit once again with their asymptotic limits, because the nature of events in our universe, re’ the S-curves, creating creativity and so forth, has NOT been widely figured out AND understood, tho the answers ARE there!!!
.
Read more about those pesky exponential barriers: sections 7 down thru 11. And what they mean in terms of how those very likely arise.
.
.
This is a continuation of the theme in the MOE article which is needed to show in much fuller detail and instances how growth occurs mostly via S-curves and why those can be of such great value.

.
MOE here:

https://jochesh00.wordpress.com/2019/07/13/towards-a-model-of-everything-moe/

.
And we connect the S-curve as well to the capabilities and limits of growth. Nothing works forever. No tool can do it all. We use a full toolbox of such methods, both in terms of professional skill sets plus the devices, instruments and other methods we use to get our work done efficiently.
.
The VIPoint is that the growth is derived from Least Energy effects, which can be estimated, if not measured, by comparison processes (CP). Our brains do this automatically when they describe events using words. They choose the best, most detailed and accurate words to define an “event” as precisely as possible, when trying to communicate that to others, or in writing or problem solving. If we cannot get it with the first try, we try it again: “Play it Again, Sam.”, the Trial & Error sorting methods writ large. We have an entire Kategoria of Aristoteles to do that, in fact.
.
The point is that each method to be tried can be given a comparative efficiency rating. Does it do the job in the shortest time, the least cost, the least materials, all of the major aspects of the Cx Sys of the 2nd Law? Some of those do combine, and in many cases, such as time savings using a least distance, least energy method, that sums it all up, but is not intrinsically complete. Thus we must at times check the whole Cx Sys of the 2nd Law to be sure. There is no math at present which can sum up those complexities. It must greatly advance to do that.
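That “comparative efficiency rating” can be sketched crudely as a weighted score over the 2nd Law factors. The factor names & weights below are illustrative assumptions of mine; the point stands that no single number fully sums up the whole Cx Sys.

```python
def efficiency_score(method, weights):
    """Least energy wins: lower time, cost & materials give a higher score."""
    return sum(w / method[factor] for factor, w in weights.items())

weights = {"time": 1.0, "cost": 1.0, "materials": 1.0}
hand_tool = {"time": 10.0, "cost": 1.0, "materials": 2.0}   # slow but cheap
power_tool = {"time": 2.0, "cost": 4.0, "materials": 2.0}   # fast but costly

best = max([hand_tool, power_tool], key=lambda m: efficiency_score(m, weights))
```

With equal weights the cheap hand tool wins here; change the weights and the ranking can flip, which is exactly the incompleteness the paragraph warns about.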
.
AND because there are multiplicit factors, the S-curves, if used in tandem with outcomes studies, more completely describe such growth.
.
.
Now the warning by Ulam comes into effect. We can use math ONLY piecemeal, not all at once, to tell us HOW much each factor figures in. Math has to do all the pieces of it individually, and that’s the problem with using a math, which we do not have, to figure Cx Sys. Maths do NOT sum up all of the distances, times, LE, costs, least materials, etc., at all. Only word descriptions can do that. Thus we estimate the outcomes whenever we do anything, practically all day long. We pace off the distance. We get an approximate time by comparing to a watch. We derive an answer as to what we think is likely to be correct. It’s NOT scientific, and it often does NOT measure precisely, but we have no time to DO ALL of that!! Thus both the maths and the sciences fail us, daily, practically. We can’t make our day to day decisions based upon scientific, peer reviewed studies!!! Nor can we ever do so. This is how, empirically, introspectively, we get about, day to day, very likely.
.
Walking cameo article here:
.
.
This is a major problem in current science. Therefore we use this simple LE guide. How far does it appear to be? How heavy, how long does it take, & so forth? Thus are the practical rules created by LE rules, which guide us all day long in this. This is Personal Knowledge, and it works. Carpenters use their highly efficient “eyeballing” methods all the time to save time in measuring. And they are pretty good at that. That also specially exemplifies the difference, yet again, between a professional & an amateur.
.
When I once read a lovely book on Arkle, the great UK racehorse, I wondered: did the horse trainers scientifically verify all of those details about raising the horses? Of course not, but they knew HOW to optimize the outcomes of the training. No science. And this is the point. Despite that, they reached the ability to ID and train very good horses to do the job. Now, HOW? Personal knowledge methods work!!! And we solve that herein, by showing the specific details of the skills they used to do the job. But the point is, it’s Cx Sys. There are so many factors to be considered, the math fails us when N=/>3. Thus we use the methods of Ulam, find the repeating stabilities in the Cx Sys, and then understand those. And the Monte Carlo method is very practical, empirically based and guided, because it takes such findings, and using those empirical standards, explains them better. We look at OUTCOMES to solve the N=/>3 complexities, very likely, day to day, hour to hour.
.
But look, for each part of the 2nd Law of Thermodynamics Cx Sys, we must do the math for the length, distance, time, weight, cost, etc., savings. The math cannot treat it as a complex whole, just do it piecemeal. Thus, as Ulam stated, it must advance greatly if it is to describe the aspects of the Kategoria. That’s one of many unsolved problems, but a major one.
.
Then in the same way we can create a driverless car which will work.
.
.
The S-curves are figured out in the brain by estimation. The brain sees the OUTCOMES of multiple trials, as do the bees in finding the traveling salesman solution of nectar/pollen collections. They do NOT use math, but nonetheless create a sort of 75% solution of the theoretical optimum of how to do that.
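The bees’ trial and error routing is often modeled with a nearest neighbour heuristic: no math, just “fly to the closest unvisited flower next.” A self-contained sketch (the flower coordinates are illustrative assumptions of mine) comparing that heuristic against the brute-force optimum:

```python
import itertools, math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_len(order, pts):
    return sum(dist(pts[order[i]], pts[order[i + 1]])
               for i in range(len(order) - 1))

def nearest_neighbour(pts):
    """Greedy route from flower 0: always visit the closest unvisited one."""
    left, order = set(range(1, len(pts))), [0]
    while left:
        nxt = min(left, key=lambda j: dist(pts[order[-1]], pts[j]))
        order.append(nxt)
        left.remove(nxt)
    return order

pts = [(0, 0), (1, 2), (3, 1), (4, 4), (2, 5), (5, 0)]
nn = tour_len(nearest_neighbour(pts), pts)
best = min(tour_len([0] + list(p), pts)
           for p in itertools.permutations(range(1, len(pts))))
# nn can never beat best, but it typically lands close to the optimum
```

No factorial search, just repeated local comparisons of outcomes, and the route still lands near the theoretical optimum, which is the bees’ trick in a nutshell.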
.
This is further explicated in
.
.
A short cut by walking is a LE path, clearly. We have LOTS of names for those, which can fit each of them into a Kategoria of Aristoteles. And by doing so can find a practical way to solving the problems.
.
The S-curve method, however, CAN solve this more directly as well, and for this reason. It can give an estimation of the two factors: the markets’ emotional, dopamine drive, AND the estimated LE, Cx Sys energy savings. The two together will create a number which can, as effectively as possible, estimate the slope upwards of the S-curve if the product is used. Initially there will not be a lot of data on those, but over time a large build-up of information from which patterns can be found will make the system ever more reliable and approach high levels of utility and reliabilities.
.
This is what the S-curve approach can give. Is the product marketable and efficient? Or the one or the other? Clearly, if money can be made because it’s mostly driven by emotions, the meme (R. Dawkins) spreads over time, thus fads and fashions, then it can be estimated with that alone. But fads and fashions do NOT endure, because they are most of the time NOT LE efficient. Durable goods ARE, however. And that creates their persistences & stabilities, where fads and fashions fail. Thus we see that LE also describes very well the difference between quality, value and durability versus the “this too shall pass” of fads/fashions. Amazingly fruitful and explanatory!!!
.
“The Spark of Life & Soul of Wit” article here:
.
.
But for durable goods, a product must ALSO give a solid LE advantage over the competition. Because people can tell which methods and technologies work better by their estimates as well, over time. The brain is very good at doing what math cannot: solving Cx Sys, multifactorial questions by comparing outcomes. People can be very good shoppers (CP most all!!), comparing goods for the most durable, the least cost and best values for the money, etc., by the above outcome estimators built into the brain. That sums up the many factors and makes a decision. That, then, is how it’s done. We must do the work. It’s not magic, but it can very likely provide many good answers as to whether a device, tool, fad/fashion will work or not.
.
Steve Jobs was very good at this. So are the Japanese with cars. They make the cars Fun to drive, by the steering, so it sells better. That emotionality is a big factor which has made them a lot of money, PLUS the clearly found quality of manufacture, which US autos could not at first make, either.
.
& for years, Mercedes Benz relied upon the brute force approach of needing high maintenance for their quality upkeep. When in fact precision engineering by the Japanese (Nihon) made the Deutsch cars obsolete, too. So they had to build in precision engineering to CUT the maintenance costs which had given them lots of income, in order not to lose sales and money, market share. This is yet another huge example of efficiencies driving the markets, by competition. Who wanted to pay those huge maintenance bills with Mercedes and the others? When the Lexus had very high quality, low maintenance, and would last 100K’s of miles, too.
.
There are many, many factors involved. And for N=/>3, there is NO mathematical way to be sure of those outcomes. This is why computers are used to build cars: because those can, more efficiently and quickly than humans, integrate Cx Sys designs by massive sorting. It must be empirically tested. And that means, IOW, Steve Jobs was good at doing that, combining utility AND fun in using, which created the hugest successes of any company and any line, Apple Inc. and the iPhone. How did he do it?
.
Very simply, he combined the marketing astutely with the quality of product. No machine, nor math, nor computer can do that. He built, then, too, product loyalty. Those are the facts which need to be stated. If we had, as Ulam stated, an advanced mathematics which could, then it’s likely it’d work as well as Jobs, or Iacocca of Chrysler, too. And that’s the magic of Feynman’s diagrams as well. Those quickly did a lot with a little: a few minutes compared to 2 weeks of computations by hand.
.
The S-curves of growth then give us the means to analyze each product, and by testing in the markets, find out what will work/sell and what will not, because that real S-curve shows us the combo of emotional/marketing factors and quality of product in most cases. The Hula hoops versus the Nissan models, for instance. The first is simply fads/fashions, which have a real place in sales. And the high quality of performance of the Z cars, which gave Nissan the growth curve and income to become a large company, as well. Toyota has less of the quality, but they are still the world’s largest car maker, for those reasons.
.
& that is the value of the S-curves, too. The more successful a product (or a movie) is, see
.
.
the faster the growth. And the higher the slope of the S-curve, too.
.
Just compare the successes of Spielberg financially by sales, with his competitors. Who made the most money and why? He, as a professional, but not scientifically, figured it out, just like all the other professionals. Math and science were not needed, as in most cases of professional skills and successes, either.
.
That is Polanyi’s “Personal Knowledge” writ large, as well.
.
The S-curve can be created by taking the quarterly profits of Apple, and factoring in the I-phone sales, and then seeing the growth curve, as it speeds up and then reaches very high values, and then finally as at present slows down well past the flexion point of diminishing returns of the S-curve. Apple’s sales and so forth have now reached the S-curve limits to growth. & only Jobs could jump start that Next technology which would create such growth in Apple’s sales, again.
.
A similar pattern can be seen with the quarterly growth of the Chinese economy, or that of India, as well. This will give an efficiency model, in which the rate of growth increase corresponds to an efficiency level combining market value and actual improvements in efficiencies.
.
That’s how it’s done. And the S-curves can be easily adjusted for the estimated slopes/efficiencies rates. Thus if we see a standardized series of curves from the sales, we can estimate the future growth AND when it will begin to decline. Not exactly, but far, far more closely than if not, esp. than by simply guessing. This is empirical, scientific testing of the markets, in short.
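A hedged sketch of that fit, in Python with made-up quarterly figures (no real Apple or sales data), recovering the S-curve parameters by brute-force search and so estimating the flexion quarter where growth starts to decline:

```python
import math

def logistic(t, K, r, t0):
    return K / (1.0 + math.exp(-r * (t - t0)))

quarters = list(range(12))
sales = [logistic(t, K=100, r=1.0, t0=5) for t in quarters]  # "observed" data

def fit(ts, ys):
    """Grid search over (K, r, t0) for the least squared error fit."""
    best, best_err = None, float("inf")
    for K in range(80, 121, 5):
        for r10 in range(5, 21):       # r from 0.5 to 2.0 in steps of 0.1
            for t0 in range(2, 9):
                err = sum((logistic(t, K, r10 / 10.0, t0) - y) ** 2
                          for t, y in zip(ts, ys))
                if err < best_err:
                    best, best_err = (K, r10 / 10.0, t0), err
    return best

K_fit, r_fit, t0_fit = fit(quarters, sales)  # t0_fit is the flexion quarter
```

With noisy real data the fit is approximate rather than exact, but the fitted t0 still marks when the decline in the growth rate begins, which is the prediction the paragraph describes.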
.
The same curve can be done with the Mustang, the Chrysler SUV, and that of the others, too. Those will grow and then decline in growth rate, and from the outcomes of those, the future sales/profits and long terms sales can be predicted, too.
.
In fact, this is a method of the “Study of History”, and Asimov’s Psychohistory, in short. The S-curves of growth will tell us what will emerge and become of long term value and what will not. It can predict, detect and foresee emergent systems by their efficiencies, which will drive unexpected growths, & which can be developed. The efficiencies of those growth curves are directly proportional to the slope of the growth: the higher the growth, the greater the slope towards vertical. Recall that it’s almost always (fast immer) the emergence of growth in Cx Sys which makes them unpredictable as to outcome. S-curve evaluations CAN substantially take care of those emergent growth events, by showing they are likely due to LE, with S-curves yielding very likely growth.
.
This also applies to the weather, which is full of growth events, like storms, cyclones, tornadoes, blizzards, monsoons and much else. S-curves can be constructed much, much more efficiently to predict those events, which start out not predictably and then grow to highly important outcomes. Indeed, most all Cx Sys can be predicted to some extent, this way. The problems of Cx Sys & predictability are those of Growth, of S-curves embedded in the processes. Once we can see what will likely grow as an S-curve, then it becomes lots more predictable, as well. That is the promise of S-curve mathematics, in short.
.
And the stabilities of the genomes are the same. Why do some species last a very long time in the earth records, such as Limulus, Lingula and the Sequoias, & the extraordinary blue-green algae Stromatolites, which have endured for 3-3.5 gigayears? Those are very LE and thus very stable. Having survived that long, it’s NOT just chance, and so forth, but real, basic efficiencies, highly built in and identifiable if we but know and can and will look for them. Durability and stabilities of genetics is LE efficiencies, clearly, and writ VERY large.
.
This weighs in heavily and decisively in our own species individual and species longevities. Those whose S/F are highly efficient will live a lot longer. Those species which are very efficient, have also very stable genomes, BECAUSE they are efficiently put together in so many CxSys ways. From the metabolic, to genetic, to cellular, structural, etc. ways. This is how LE reforms &  highly likely extends our understanding of evolution’s processes WAY past competition and survival methods, alone. It’s a substantially extending method without limits, too.
.
This is what’s going on with the S-curves, which can be used in this way to predict growth, thus return on sales. Which product to develop and which not to, also. & fitting those curves to the sales data will give more info, every year, about the new, unexpected take-offs in products, to very likely being able to predict which products & services will grow, &/or why those will or will not sell, either. This is Uber in a nutshell, BTW.
.
This is the high promise of Cx Sys understanding using the Pentad as above. And it’s doable, just as antibiotic resistance can be successfully countered in most all cases, and meds be made to last longer, such as Viagra 50 mg. for nearly 3 days instead of only a few hours.
.
These nearly universal processors, the CP, the LE, the S/F, the Cx Sys methods, & the many tools, devices, instruments, methods and techniques, show what can be done.
.
That’s what’s on the line here. Survival of companies and their products, & the survival of ourselves.
.
The big problem with Moral relativism was that it didn’t factor in Survival, that is durability, that is efficiencies, which create Stabilities!!! It also ignored the Biological imperative of Survival! It left those out, and survival is very important both to the survival of a species & to EACH species’ longevity in this universe. Survival is very likely, in fact, the most important biological imperative. And if this is missed, by ignoring evolutionary methods to sort through events, then it has missed that VIP and has failed.
.
Let us further look at the other points: those of Emergent phenomena in societies, such as relationships and those cultures which grow; those of the storms which develop out of events in the weather; those of the emergences of new species, and new major phyla of plants and animals. It’s those emergent phenomena which make prediction difficult. But, and this is the point, the acute, clear nature of Efficiencies in such natural phenomena is what creates those growths, is not?
.
Thus if the S-curves of the growth of storms, both tropical hurricanes and wild thunderstorms & tornadoes, are studied, we will find those growth S-curves developing, driven by LE processes. By finding measuring tools for some relationships, it will then be found HOW to predict which storms will come about and which will not. Will the TS become a hurricane? Well, what is the growth curve estimate? If the slope is high enough and the water is warm enough, it will; if not, it will not. Using outcomes studies of the Monte Carlo kind developed by Ulam, this can be used to create a growth curve which will correspond to a decision. But time will tell, will it not?
.
We know already that TS’s and hurricanes, typhoons and cyclones will develop in & due to very warm water, during the summers. We also know when they stop. These are empirically founded facts, which are found repeatedly by observations. Thus the repeating events in existence reinforce themselves in our LTM, and then we find the trends, relationships & data, which AI cannot find. Once we know how those work, then we can create AI to do a good bit more of that, too. If we can find & use a highly efficient method to move warm surface water to lower levels, and sow 100’s of those devices in the path of a hurricane, we can kill it, can we not?
.
For instance, if a way can be found to create an efficient quantum heat engine, the path in front of a hurricane can be sown with 100’s of QHE’s and by cooling the water it moves into, kill the hurricane!!! This is a way to block more devastating hurricanes and if not kill them render them so much less powerful, that most damage, injuries and deaths can be prevented. This is now within our grasp, as well.
.
The LE methods do that. We do NOT have DA (dopamine) drivers in natural phenomena, which make the growth of sales and marketing necessarily more complicated. Natural systems are simpler, clearly, and this is empirically shown and highly likely the case. It’s a LOT simpler to predict TS’s than human behaviors. If products were sold only by their efficiencies, then those would always be winners. But they are NOT. And Jobs understood that.
.
IF we realize, however, that natural Cx Sys growth is virtually ONLY due to such LE drivers, then those are much more predictable than humans, as well. Thus in the natural systems, growth can be predicted by measuring LE outputs and drivers. & the weather can be more predictable than, say, human history. But BOTH are amenable to those methods. History is least energy, too, and very hard to change for that reason. Toynbee has shown that.
.
For instance, we can study the outcomes of laws. If they are enacted with a purpose in mind, then we study by outcomes to see which do that the most efficiently in a society. If they work, they have a high growth and stabilities S-curve. If they do not, then they pass away, and are not valued.
.
Such studies will create an empirical, predictable basis for making Psychohistory a reality, will those not? Why do the laws work or not in  a Cx Sys human society or any animal’s societies? Human emotional systems are good social organizers, but they are NOT very good information processors. They are Cx Sys.  & have rules of their own, which the astute politicians & the great persuaders know how to make them work. That is the nature of Cx Sys solutions to social problems, and economic problems, as well.
.
Why did money, for instance, drive out barter? It’s very, very clear why. First of all, the complexity of the barter. Everything had to be compared to everything else for values, other goods/services, or food & manufacture, which was related to n! (factorial) complexity rules. IF there were 20 items to be sold, then the number of possibilities was N = 20!. Without money, in today’s economy it’d be impossible and would ruin sales, take huge amounts of time, and much else. Money is efficient, which is why barter is no longer used very much. It’s that simple. LE. Thus the applications of money in more and more efficient methods kept on working & kept up economic growth. Its efficiencies were so high it spread, rapidly grew, and displaced most all barter, is not?
.
How do we create a stable currency in small habitats? That will have to be solved. And it will be done by least energy methods, as well. Guaranteed, too. Currently Bitcoin is the latest financial fad. It’s totally unsupported and fluctuating in value. It cannot compete, efficiently, with modern financial instruments and methods we currently have. Thus it’s a fad, and MUST pass.
.
Barter creates a huge complexity. But if each item/service can be sold in terms of a single, common barter element, that is silver coins or gold coins for instance, then the entire trading is highly sped up and simpler, is not? & that’s why money drove out barter!!! And it’s why bad money also drives out good, is not? The other advantages were easier accounting and related figures, at least.
.
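Made concrete, under the usual economists’ count (an assumption of mine; the n! above counts orderings of trades, which is larger still): with n goods, barter needs an exchange ratio for every pair, while money needs only n posted prices.

```python
import math

n = 20
pairwise_ratios = n * (n - 1) // 2   # 190 barter exchange ratios to track
money_prices = n                     # only 20 posted prices with money
trade_orderings = math.factorial(n)  # 20!: the count of trade orderings
```

Either way the tally is counted, the blow-up is what money removes, which is the LE advantage in a nutshell.
.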
Further, the S-curves of growth determine which religions will succeed and which will not; which behaviors are better & more easily used, or not.
.
Take for instance alphabets VS. hieroglyphic/character writing. With the alphabets there are 20-30 letters for the sounds, in varying amounts of overlapping usages. For instance, S can be “SSSS”, or “SH”, or Z, or even ignored, which creates a complexity of usages and inefficiencies. But with glyphs/characters, the amount of visual and word storage is hugely greater, with 1000’s of them to recall. Thus the child cannot read in such languages until about age 8 or so, whereas in alphabets, by 3. A 5 years’ advantage due to that difference alone is decisive. Which is why the PRC created the pinyin system, as it alphabetized those, and thus made typing and printing far, far easier, too, as were the memory requirements.
.
But compare & contrast English to Espagnol, where the letters, consonants & vowels are almost ALWAYS pronounced the same; it’s way easier than in English, where that’s NOT the case. So to increase the efficiency of English, use ONLY those letters which have a single sound. If there are too many letters with too many sounds for each, it’s complicated, by the N! rule. If there are too few, then there are not enough to make the language efficient, because then there must be overlaps, as there are in English.
.
The CP and LE guides, including how words are formed using the tongue, teeth and related vocal cord outputs (which is Cx Sys, BTW), can be made more efficient by testing using the S-curves. This, then, is the CP, LE guide to creating a very, very efficient language, far, far better than any of those at present.
.
We also understand why Esperanto failed. It was not EFFICIENT.
.
Now take the hugest advantage of English. There are NO genders for nouns unless those are biological. In the romance languages and many others there are very arbitrary genders, such as feminine for windows, of all things!!! Those give a greater complexity & memory requirements for the nouns, but English shows it’s not needed. The more words, the more gendered forms, & so that quickly becomes a real problem, too. So English has, per the total number of nouns or like words in a language, that many fewer la’s, las’s, el’s, los’s and such than does Espagnol. English is LOTS simpler to remember! By 100K’s of times over. The storage size for word memory is lots lower, by 100K’s of words in modern languages, is not?
.
That’s why English does so well.
.
But, in English, the letters can be a LOT more complicated, as C for K, or C for SH, etc. The problem is that there are no standard pronunciations for EACH letter in English. The number of ways -ough, -augh, and related can be pronounced is absurdly high. Get rid of those!!!!
.
SpaTial, There, WiTh, StaTionary, and so forth. It’s entirely specific, contextual pronunciation, which is very hard for many to follow, too. There are NO grammatical rules for same. It’s entirely arbitrary & Ad hoc!!! “Special” is yet another of the many complexities of English. There are 100’s of other examples throughout, no doubt due to English being highly syncretic, being the combination of Anglo Saxon, Welsh, Scot, Irish, German, French, Latin & many others, too. But the efficiency of English without all those genders is crucial. And English can be easily reformed, as Webster tried to do, by simplifying those words to a speedier use which, like Espagnol, is more standardized.
.
But people do NOT want to learn new things. So the usual brain hardwiring, and fossilization of behaviors by LTM lay-downs (& habituation and facilitation), is active here. Eventually, due to limitations (mostly the huge proliferation of words), it will have to be done, and as in the below Chinese model, it will become far, far more efficient to write, read, and speak by making those changes. Computers will then be much more easily able to understand it, as well. Which, come to think of it, is THE way to translate English into an AI usable language. Simple, elegant, and problem solving. If we want to make voice activated computers more efficient, we must make English more computer user friendly, too. That will drive the reform, as in Pinyin.
.
Simplify, and English will become much easier to speak, & within a few years it will overwhelm the others, as well. Efficiencies count in usage; least energy works. And this can be shown with the Com- words, which will be explored later, too. Least action/energy applied in languages.
.
The problems with Esperanto were essentially inefficiencies which other languages did not have. It made each word simple and easy to pronounce. But in almost every case, the words for simple, commonly used events were much longer, with more syllables than other languages used. Indeed, almost TWICE the length. The least energy rules were broken all over by Esperanto!! The most often used words in established, highly used languages are far shorter than Esperanto’s. Thus, Esperanto once again failed the complexity, least energy rule. And it failed badly, too. It marked most all nouns with the same ending: they all ended in “o”, which was stupid. We had to read to the end of the word before knowing it was a noun, whereas articles like “the” show that at once!!! & it added unneeded complexity to speaking, unlike English, which does not. It was, in short, bass-ackwards with the way words are processed left to right!! Again, an LE mistake!!!
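That least energy point about word lengths is measurable; linguists know it as Zipf’s law of abbreviation. A minimal sketch, using small illustrative word samples rather than a real corpus count:

```python
# Zipf's law of abbreviation: the more frequent a word, the shorter it
# tends to be. Compare ten very common English words with ten rarer ones.
most_frequent = ["the", "of", "and", "a", "to", "in", "is", "it", "you", "that"]
less_frequent = ["stationary", "proliferation", "fossilization", "habituation",
                 "panoply", "syncretic", "hierarchies", "efficiencies",
                 "pronunciation", "transitional"]

def avg_len(words):
    """Average word length in letters."""
    return sum(len(w) for w in words) / len(words)

print(avg_len(most_frequent))   # 2.4 letters
print(avg_len(less_frequent))   # 11.1 letters
```

The most used words come in at a fraction of the length of the rarer ones; Esperanto, by padding its commonest words, ran against exactly this least energy pattern.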
.
Moreover, it did not have the richness of associations seen in natural languages. Such as God, Good, Gold; Devil, Evil, vile; Violence being the opposite of Live. Those deep cognitive relationships are very, very rife in the languages, and Esperanto had none of them. It was not efficient, NOR fun to speak, either.
.
We find the same going on with the hieroglyphs and characters versus the alphabet. With the alphabet a 3-year-old can learn to read. With the H/C methods it takes about 8 years to learn enough characters to read & write. And they cannot use a typewriter without something like 5K keys, either!!! Which we do with “qwertyuiop”. So office memos long had to be handwritten in Japanese, Chinese and related languages.
.
But when Pinyin was created, which translated in a very standardized, simple way most all characters into alphabetic 2-3 letter words, the problem was solved. The 2-3 letters could be typed, the computer would pull up the character, and then they could type! But the machine had to process the letters into the character, and that was an inefficiency.
.
So why not go for the gold? ONLY use the Pinyin & ignore the characters. Well, that makes the transition hard, because as 30% of the readers begin to lose their characters, the past history and the many writings already in Chinese characters become inaccessible, too.
.
But it’s coming. Eventually, the alphabets will drive out the characters, and hieroglyphics, too. And by looking at the growth curve of the transitional period, by which Pinyin is replacing the characters (about 30% in the cities cannot now read characters), we can predict about when that transition will occur, and then massively grow toward 100% Pinyin. The S-curve of growth, is not?
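That timing estimate can be sketched with the standard logistic (S-curve) equation. The ~30% figure is the one quoted above; the yearly growth rate r is a purely hypothetical, assumed parameter, so this illustrates the method of estimation, not an actual forecast:

```python
import math

def years_to_fraction(p_now, p_target, r):
    """Years for a logistic adoption curve, currently at fraction p_now,
    to reach fraction p_target, given growth rate r per year:
    p(t) = 1 / (1 + exp(-r * (t - t0)))."""
    t_now = -math.log((1 - p_now) / p_now) / r        # today's position on the curve
    t_target = -math.log((1 - p_target) / p_target) / r
    return t_target - t_now

# Hypothetical: 30% Pinyin-only readers today, assumed r = 0.15 per year.
print(round(years_to_fraction(0.30, 0.99, 0.15), 1))  # roughly 36 years to near-total adoption
```

With a real adoption series one would fit r from the data; the point here is only that once any two points on the S-curve are known, the rest of the transition can be timed.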
.
If English is changed to the Espagnol system of only 1 pronunciation for each letter (no S, K, or SH sounds for C; no TH or SH for T; ONLY S for S), then that will increase the speed of reading, writing & understanding, with a standard pronunciation as well. Such a system will have a growth rate of its own & can be estimated by using it more widely & then comparing that to a growth curve, to see how it works.
.
There was a similar growth curve when American English, via Webster, lost the “u” of the French spellings (harbour, parlour), which is still used in London English today.
.
That can give an idea of how fast it changed, altho it was promoted by the dopamine system as well, which modified the S-curve effects, too.
.
This gives an example of how S-curves can work and be used to understand, and better predict, the nature of growth in CxSys of all kinds. And it is meant as a primer for the creation of S-curve maths, which can begin to predict emergence, how valued new goods and services can become, and where the flexion points of diminishing returns will sit at the tops of the S-curves.
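The flexion point named here is, on the standard logistic curve, the midpoint: growth per period is largest there, and diminishing returns begin just past it. A minimal numeric sketch with unit parameters, purely illustrative:

```python
import math

def logistic(t, K=1.0, r=1.0, t0=0.0):
    """Standard S-curve: slow start, fastest growth at t0, saturation at K."""
    return K / (1 + math.exp(-r * (t - t0)))

# Centered growth increments per unit time; the largest one sits at the
# inflection point t = t0 = 0, and returns diminish after it.
ts = list(range(-5, 6))
inc = [logistic(t + 0.5) - logistic(t - 0.5) for t in ts]
peak_t = ts[inc.index(max(inc))]
print(peak_t)  # 0: growth peaks at the curve's midpoint
```

So spotting the flexion point is just spotting where the period-over-period gains stop rising; past it, every further period buys less growth.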
.
& can be applied to political & new cultural movements in all areas, including the spread of musical forms, such as pop songs, hip hop, and the various forms of CW and Latina musicas, as well. It’s a virtually unlimited source of information about growth and development of all types, and as such can accompany the growth of our understanding of CxSys in languages, and how those developed and changed over time. Which, very likely, is what understanding more of our universe is all about.
.
Another major point is that of “correspondence theory”, which is very ancient. And the CP, LE, S/F and complex systems model, plus all the methods, ways of doing things, approaches, techniques & technologies, tools, instruments, etc., is superior to that.
.
It avoids the problems of morality by taking those clearly into the realm of the sciences, being studiable by outcomes research, as well. We know a morality and ethics by their outcomes, and those can be very easily observed, studied and brought within the purview of the sciences. So in fact, using CP unites this model and overcomes many of the objections to correspondence theory, by replacing it with a virtual Model of Everything, of which correspondence is a part, but not the whole. & this is what’s going on: giving a neurophysiological, neuroanatomic, and neurochemical basis to the theory. It firmly founds and absorbs a modified correspondence model into the CP/LE fold, just as it does behaviorism into cognitive neuroscience.
.
And lastly, that’s the beauty of this CP, LE model. It is, if used well, likely very unifying. Taking us “beyond the absolutes” of correspondences to the more empirical, more likely true relationships among events (Einstein, “Physics and Reality”), which are anything but absolute, and instead unlimitedly studiable and knowable. We remove the pagan, Platonic idealisms & silly, unreal absolutes & ultimates (the fallacy of idealisms: that all events must correspond exactly to our ideas) & replace those with the fact that events are the standards, instead of ideas, and re-create correspondence simply in a newer, more scientific, testable, and empirical form. This is an escape from the puerile, self-centered, homocentric, geocentric, regressive behaviors of children to a more adult form. & it can allow us, potentially and nearly universally, to understand aliens, as well as the other animals and plants around us, locally. It comprehends what is within us and what is outside of us, equally well. A fine, nearly universal Model of Everything.
.
& thus do we go beyond the absolute limits to knowledge, & thus incorporate correspondence as one of the many forms of CP and LE words/ideas. This is protean in its implications, because it not only extends Einstein’s epistemologies into the verbal descriptors and makes them empirical, but also shows how a few changes can modernize, indeed make far, far more relevant, correspondence theory. Thus, we do much with a little & our knowledge becomes substantially MORE united, as CP and LE are easily able to do, provably and unlimitedly. But not quite!
.