The Limits of Comparison Processing

By Herb Wiggins, M.D.; Clinical Neurosciences; Discoverer/Creator of the Comparison Process/CP Theory/Model; 14 Mar. 2014
Posted 27 Aug. 2018, Copyright 2018.
.

It's been written many times before what the efficiencies and outputs of the comparison process (CP) are. Vast numbers of events in existence are very similar to each other, and they drive the creation of the categories of Aristoteles and the levels of the hierarchies of our understanding. We can detail and compare the members of each of those synonymic word clusters/phrases to each other, which gives a much more efficient and thorough, tho still not totally complete, description of events. The CP creates creativity by setting up standards, conventions, rules, laws, etc., against which we describe events both internally and externally. Those standards are compared to events, and the differences and similarities among them, in wide gradations, allow us to create information. ROY G BIV is an excellent example. (Do we have a near absolute sense of colour, as some have near perfect pitch? Not absolute, but close enough.) So are high, higher, highest; low, lower, lowest; hot, hotter, hottest vs. cold, colder, coldest; and the unlimited panoplies of those linear adjectival systems. But the central one is the comparative form, between the base and superlative forms, which generates most all the rest. And that has been missed.

.
.
Math works much the same way, by creating scales against which we measure lengths, heights, hardness, softness, warmth, cold, pain or its absence; and the pleasures as well, as in the morphine standard for the pain meds, the analgesics. By counting we create information. By measuring against our standards and scales we create numerical information, which is then imported into our brains/minds for further processing. This process internalizes basic parts of the universe and allows us to find further patterns, much in the same way we compare verbal descriptions, which are vaster, richer, and far, far more flexible. Essentially this shows how we have creatively mathematized our sensory inputs, as well.
.
Converting the sensations into mathematical, measuring structures:
.
.
CP creates the recognitions by comparing events of all kinds and flavors: feelings, visual colours and the gray scales, and so forth. Some of these can be measured, but most cannot, such as the loves, the hates, etc., and that marks the disparity between the measuring scales and the vastly greater and more flexible description scales used so largely in biology and medicine, and in its subclasses of the exams, the diagnoses, and the reading of most all lab tests, radiological scans and much else.
.
.
So we create the recognitions, then the pattern recognitions, pattern recognitions, pattern recognitions of our hierarchies, without limits. But what of the limits of this method? We know that most all methods, devices and scales have their capabilities, and yet their limits, and the unlimited ranges in between. And it's a good craftsman who knows his tools. Those limits have been detailed to some degree before, as well.
.
Using this vastly efficient CP series of methods, descriptions and measurements also shows their limits. Just as logic (and maths, too) falls to the empirical test of its essential false dichotomies, A or not A, A or B, hot or cold, white or black, we see that it misses a LOT of the grays in between those two poles. It simplifies, and allows us to sort by elimination, but misses most all of the rest. Whenever we hear the critique of "oversimplification" we know we're dealing with a logical limit, and likely with complex systems. It's not just deduction or induction. It's all of those standards we use, from the moral, religious, spiritual, legal, mathematical and logical truths, including the empirical logics of the sciences. There are many truths of value, historical and legal as well, which are closely related. As are the genealogical truths; and the limits to paleontology and archaeology, because data/info decays in time. That is TD (thermodynamics), writ large, in the limits of our knowledge over times past. And it is essentially WHY the forensic teams get to the crime scene ASAP, before more info disappears, and manage the crime scene to prevent the same. It's TD, again.
.
Thus we have some serious limits to logical methods, which Godel showed very clearly in his Incompleteness Theorem, called by the shorter name, Godel's Proof. But is not most all our knowledge incomplete? And yet the issues of sorting problems have yet to be more completely addressed regarding most all problem solving methods, and will be in later articles on this key, deep point. Hofstadter addressed this highly important point in "Godel, Escher, Bach…": the "this statement is not true" form created the paradox of incompleteness. The global, but not specific, negative did so.
.
But comparison processing is not a global negative; it is an exclusive sort of info processing. Its work is done by innate, implied or clear cut "exclusions" rather than negatives, and so it is far, far more applicable to events in existence than the limited logics. That's why it's used by most brains, in one form or another, daily, and without limits. The limits to our ideas, and to the methods and devices they are based upon, are driven by this incompleteness of logics. Altho CP has limits by exclusion, and by events not being comparable, it is far more capable because it uses relationship, the relational logics, which go way past what logic can do of itself. And indeed it drives the other logics, which create the many truths, as well.
.
A detailed examination of this problem of the global negative has already been discussed. The negative, in short, is a kind of exclusion, and the global negative, all A or Not A, is of this inadmissible type. Nothing in events in existence is EXACTLY equal to something else. They can be close. But the CP admits this, and unless specific exclusions occur, its logic is far more useful and vastly more widely applicable than "A = B" or "A or Not A". The exclusion principle largely disallows those false identities of logic, which are often the formal logical fallacies, while the CP intrinsically deals with them using similarity, relationships, and so forth, which neither logic nor math can easily do.
.
So there we have it. If logics and math are incomplete, then much else can be also. In the "Grand Design", Hawking stated from the first that our physics is incomplete, and that is the case. Bell also stated that QM was not complete, as did Feynman. We cannot solve the QM equations for huge complexities. And Feynman also stated the obvious, but missed a deep truth: we cannot develop biology from QM, he said. Which in modern terms means that complex systems elude treatment with maths and physics. Ulam stated that math must "greatly advance" before it's able to describe/model complex systems. And so it is to this day.
.
Or do they? The processes of evolution, of genetics, and the complexities in each of those cannot be easily figured by those tools. But we have here good starting answers to Bell, Godel, Hawking and Einstein's search for a more complete general theory of physics.
.
And it's very clearly least energy, which Einstein alluded to at least elliptically. And how do we recognize least energy conditions? By comparing them to one another. The least cost, the least energy, the least materials, the least times, distances, least waste, and the rich panoplies of the 2nd law of thermodynamics, which more completely describes, as such categories usually do, events in existence, AKA the complex systems in the universe.
.
.
So there we are with least energy, and the CP, which shows us the least energy forms, the structure-function relationships of vast numbers of events in our universe, from the atoms, molecules, organic molecules and much else, up the reductionist, hierarchical structure to the brain/mind. The latter is best understood by S/F (structure/function) methods. Yet again, CP, nearly totally: we compare the structures of the brain to their functions, and then back again, and again; thus functional MRI (fMRI) and magnetoencephalography (MEG). And then we compare the outputs of both, for yet more information! Uncannily correct and predictable, and we see how the CP creates knowledge. Yet again and again, without limits.
.
But we can still ask this huge question: if our knowledge is incomplete, then the main driver of our info/data creating, the CP, must also be incomplete. And knowing that, in thermodynamic terms combined into Shannon's information theory, we know this: there is NO perfect heat engine. It's not possible to use energy sources perfectly for doing work. Thus complete descriptions are not possible either, because in TD terms they would mean perfect descriptions. Incompleteness of many types makes that very difficult to create. Thus there is a clear, almost blatant TD relationship of our knowledge to the "incompleteness" discussions and outcomes.
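The "no perfect heat engine" point has a standard quantitative face in the Carnot bound; as a tiny illustrative sketch only (the temperatures below are arbitrary, not from the text):

def carnot_efficiency(t_hot_kelvin, t_cold_kelvin):
    # Even an ideal engine between a hot and a cold reservoir converts
    # only the fraction 1 - T_cold/T_hot of the heat into work.
    return 1.0 - t_cold_kelvin / t_hot_kelvin

print(f"{carnot_efficiency(600.0, 300.0):.0%} at best")   # prints "50% at best"; never 100%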
.
But we know that LE creates growth in many, unlimited instances. And there are the ways out. Inventions and devices, and indeed cultures and ways of doing things, can be improved, that is, made more LE efficient. And this is what drives growth, from gravitational bodies getting larger and larger (Einstein's "compound interest is the most powerful force in the universe"), to evolution, to the market efficiencies (least energy) of Adam Smith driving most all market growth, and so forth. Building a better mousetrap is quintessentially a least energy form, is it not?
.
.
And that's the whole point. What do we run up against in most all cases? We have the S-curves of Whitehead's process thinking to thank for that. Growth occurs in additive and exponential curves, of which the S-curves are the unlimited, basic forms. And almost anything which does not break us out of, or jog, our society's current abstractions will, after a limited period of growth, tend to stagnate. Whitehead's prescient, verbal form of the S-curve, made real and mathematical, shows yet again how descriptions of many kinds can be converted into math, and how in this case most all growth, so converted, yields math creativity.
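Whitehead's verbal S-curve corresponds to the standard logistic function of growth studies. As an illustration only (the parameter names and numbers below are mine, not from the text), a minimal sketch:

import math

def logistic(t, ceiling=100.0, rate=1.0, midpoint=0.0):
    # Standard logistic (S-curve) function; parameters are illustrative only.
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

for t in range(-6, 7, 2):
    print(f"t = {t:+d}   growth = {logistic(t):6.2f}")

# Early values climb almost exponentially; later values stagnate near the
# ceiling, until some new method (a new S-curve) breaks past the old limit.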
.
But the limits of finding primes come down to finding the most efficient and fastest methods to sort out, but not generate, the primes. And the best ones used are, again, the least energy forms of each when compared to another: how fast does each sort out the primes? Again, this shows the LE kinds of sorting problems we get into and escape from. The facts are that those can be done more and more efficiently, without limits. But we MUST break out of our "current abstractions", standards, conventions, etc., to do so. And that's how it's done: understand that our methods have limits, and then exceed, circumvent and go "round the impasse". Creatively. Read the article about the Wiggins Prime Sieve, in both the first and then the latest, more developed, more complete and more efficient forms, to see this at work, empirically and mathematically. And it can be improved without limits.
.
This essentially is what's going on with CP. Using linear methods, or lines, we reach limits: light speed, absolute zero, two exponential barriers at either end of the two connected S-curves of the energies of fermions, which Einstein explored rather well, and in which, at least conceptually, our universe dwells. Again, Einstein's Great Subtleties.
.
.
So we know that the exponential barriers, the perfection barriers of TD, the HUP barriers to knowledge, and most of the rest are simply CP limits, and not always real. By creativity we can jump over those barriers in many, many instances, without limits. Is there a limit?
.
Yes, in the brain, which creates, by using the CP, the exponential barriers, or, in the case of the math primes, the asymptotic limits. Mistaking our ideas/words and logics for events in existence is the fallacy of idealism, in a nutshell. And we have no real reason to believe that there are no ways around light speed, such as quantum tunneling, or around zero K, such as negative energies. Nor do bosons have all those limits, either, as photons show; and matter under Bose-Einstein conditions DOES become more bosonic, too. So there are ways "around" the limits, clearly, and likely without limits, too.
.
The CP process limits are clear: the exponential barriers, the asymptotes, and the idealistic forms such as perfect heat engines. Within those systems we are trapped, but not forever, as Whitehead showed, and as our progress as a species has almost always shown, esp. in these days of exponential growth of our population, which must come to an end after a limited period of growth. Until the physical limits of living on our planet, instead of most everywhere else too, are addressed, we will find those "limits to growth", yet again.
.
Before growth resumes, after finding a new and better performing S-curve of growth, too.
.
The limits are between our ears: the limits to our memories and to our processing of information, mostly. The limits to logics simply mirror the fact that we cannot compare apples and eggs very well. So we create the means to measure and describe each of those. But taste has been ignored.
.
"An Infinity of Flavors?" shows a new approach to those limits and the possibility of mathematizing flavors and smell, too.
.
So while our limits are real, because the universe of vast events does not pour, as the Big Pot, into the little pots of our minds/brains, there is yet a LOT of room for growth, creatively. And this is the problem of the Land of the Undiscovered Country, too. And such is the case, very likely.
.
Every generation exceeds the limits of its past, because the universe is so very large, and our minds/brains are so very small, by comparison. We give up the idealistic illusions and fallacies of ultimates, finalities, absolutes, certainties and perfections, for the realistic probabilities of complex systems, which promise growth literally without limits. Thus most all our limits are not real, but simply way stations to better methods and ways of doing things, universally, but not quite.
.
Beyond the Absolute:
.
And that's the way it very likely is, over most all fields and endeavors, but not quite…

Table of Contents: La Chanson Sans Fin

1. The Comparison Process, Introduction, Pt. 1
https://jochesh00.wordpress.com/2014/02/14/le-chanson-sans-fin-the-comparison-process-introduction/?relatedposts_hit=1&relatedposts_origin=22&relatedposts_position=0

2. The Comparison Process, Introduction, Pt. 2
https://jochesh00.wordpress.com/2014/02/14/le-chanson-sans-fin-the-comparison-process-pt-2/?relatedposts_hit=1&relatedposts_origin=3&relatedposts_position=1

3. The Comparison Process, Introduction, Pt. 3
https://jochesh00.wordpress.com/2014/02/15/le-chanson-sans-fin-the-comparison-process-pt-3/?relatedposts_hit=1&relatedposts_origin=7&relatedposts_position=0

3A. Extensions & Applications, Parts 1 & 2

https://jochesh00.wordpress.com/2016/05/17/extensions-applications-pts-1-2/

4. The Comparison Process, The Explananda 1
https://jochesh00.wordpress.com/2014/02/28/the-comparison-process-explananda-pt-1/

5. The Comparison Process, The Explananda 2
https://jochesh00.wordpress.com/2014/02/28/the-comparison-process-explananda-pt-2/

6. The Comparison Process, The Explananda 3
https://jochesh00.wordpress.com/2014/03/04/comparison-process-explananda-pt-3/?relatedposts_hit=1&relatedposts_origin=17&relatedposts_position=1

7. The Comparison Process, The Explananda 4
https://jochesh00.wordpress.com/2014/03/15/the-comparison-process-comp-explananda-4/?relatedposts_hit=1&relatedposts_origin=38&relatedposts_position=0

8. The Comparison Process, The Explananda 5: Cosmology
https://jochesh00.wordpress.com/2014/03/15/cosmology-and-the-comparison-process-comp-explananda-5/

9. AI and the Comparison Process
https://jochesh00.wordpress.com/2014/03/20/artificial-intelligence-ai-and-the-comparison-process-comp/

10. Optical and Sensory Illusions, Creativity and the Comparison Process (COMP)
https://jochesh00.wordpress.com/2014/03/06/opticalsensory-illusions-creativity-the-comp/

11. The Emotional Continuum: Exploring Emotions with the Comparison Process
https://jochesh00.wordpress.com/2014/04/02/the-emotional-continuum-exploring-emotions/

12. Depths within Depths: the Nested Great Mysteries
https://jochesh00.wordpress.com/2014/04/14/depths-within-depths-the-nested-great-mysteries/

13. Language/Math, Description/Measurement, Least Energy Principle and AI
https://jochesh00.wordpress.com/2014/04/09/languagemath-descriptionmeasurement-least-energy-principle-and-ai/

14. The Continua, Yin/Yang, Dualities; Creativity and Prediction
https://jochesh00.wordpress.com/2014/04/21/the-continua-yinyang-dualities-creativity-and-prediction/

15. Empirical Introspection and the Comparison Process
https://jochesh00.wordpress.com/2014/04/24/81/

16. The Spark of Life and the Soul of Wit
https://jochesh00.wordpress.com/2014/04/30/the-spark-of-life-and-the-soul-of-wit/

17. The Praxis: Use of Cortical Evoked Responses (CER), functional MRI (fMRI), Magnetic Electroencephalography (MEG), and Magnetic Stimulation of brain (MagStim) to investigate recognition, creativity and the Comparison Process

https://jochesh00.wordpress.com/2014/05/16/the-praxis/

18. A Field Trip into the Mind

https://jochesh00.wordpress.com/2014/05/21/106/

19. Complex Systems, Boundary Events and Hierarchies

https://jochesh00.wordpress.com/2014/06/11/complex-systems-boundary-events-and-hierarchies/

20. The Relativity of the Cortex: The Mind/Brain Interface

https://jochesh00.wordpress.com/2014/07/02/the-relativity-of-the-cortex-the-mindbrain-interface/

21. How to Cure Diabetes (AODM type 2)
https://jochesh00.wordpress.com/2014/07/18/how-to-cure-diabetes-aodm-2/

22. Dealing with Sociopaths, Terrorists and Riots

https://jochesh00.wordpress.com/2014/08/12/dealing-with-sociopaths-terrorists-and-riots/

23. Beyond the Absolute: The Limits to Knowledge

https://jochesh00.wordpress.com/2014/09/03/beyond-the-absolute-limits-to-knowledge/

24  Imaging the Conscience.

https://jochesh00.wordpress.com/2014/10/20/imaging-the-conscience/

25. The Comparison Process: Creativity, and Linguistics. Analyzing a Movie

https://jochesh00.wordpress.com/2015/03/24/comparison-process-creativity-and-linguistics-analyzing-a-movie/

26. A Mother’s Wisdom

https://jochesh00.wordpress.com/2015/06/03/a-mothers-wisdom/

27. The Fox and the Hedgehog

https://jochesh00.wordpress.com/2015/06/19/the-fox-the-hedgehog/

28. Sequoias, Parkinson’s and Space Sickness.

https://jochesh00.wordpress.com/2015/07/17/sequoias-parkinsons-and-space-sickness/

29. Evolution, growth, & Development: A Deeper Understanding.

https://jochesh00.wordpress.com/2015/09/01/evolution-growth-development-a-deeper-understanding/

30. Explanandum 6: Understanding Complex Systems

https://jochesh00.wordpress.com/2015/09/08/explandum-6-understanding-complex-systems/

31. The Promised Land of the Undiscovered Country: Towards Universal Understanding

https://jochesh00.wordpress.com/2015/09/28/the-promised-land-of-the-undiscovered-country-towards-universal-understanding-2/

32. The Power of Proliferation

https://jochesh00.wordpress.com/2015/10/02/the-power-of-proliferation/

33. A Field Trip into our Understanding

https://jochesh00.wordpress.com/2015/11/03/a-field-trip-into-our-understanding/

34.  Extensions & applications: Pts. 1 & 2.

https://jochesh00.wordpress.com/2016/05/17/extensions-applications-pts-1-2/

(35. A Hierarchical Turing Test for General AI, this was deleted after being posted, and it’s not known how it occurred.)


35. The Structure of Color Vision

https://jochesh00.wordpress.com/2016/06/11/the-structure-of-color-vision/

36. La Chanson Sans Fin:   Table of Contents

https://jochesh00.wordpress.com/2015/09/28/le-chanson-sans-fin-table-of-contents-2/

37. The Structure of Color Vision

https://jochesh00.wordpress.com/2016/06/16/the-structure-of-color-vision-2/

38. Stabilities, Repetitions, and Confirmability

https://jochesh00.wordpress.com/2016/06/30/stabilities-repetitions-confirmability/

39. The Balanced Brain

https://jochesh00.wordpress.com/2016/07/08/the-balanced-brain/

40. The Limits to Linear Thinking & Methods

https://jochesh00.wordpress.com/2016/07/10/the-limits-to-linear-thinking-methods/

.

41. Melding Cognitive Neuroscience & Behaviorism

https://jochesh00.wordpress.com/2016/11/19/melding-cognitive-neuroscience-behaviorism/

42. An Hierarchical Turing Test for AI

https://jochesh00.wordpress.com/2016/12/02/an-hierarchical-turing-test-for-ai/

43. Do Neutron Stars Develop into White Dwarfs by Mass Loss?

https://jochesh00.wordpress.com/2017/02/08/do-neutron-stars-develop-into-white-dwarfs-by-mass-loss/

44. An Infinity of Flavors?

https://jochesh00.wordpress.com/2017/02/16/an-infinity-of-flavors/

45. The Origin of Information & Understanding; and the Wellsprings of Creativity

https://jochesh00.wordpress.com/2017/04/01/origins-of-information-understanding/

46. The Complex System of the Second Law of Thermodynamics

https://jochesh00.wordpress.com/2017/04/22/the-complex-system-of-the-second-law-of-thermodynamics/

47. How Physicians Create New Information

https://jochesh00.wordpress.com/2017/05/01/how-physicians-create-new-information/

48. An Hierarchical Turing Test for AI

https://jochesh00.wordpress.com/2017/05/20/an-hierarchical-turing-test-for-ai-2/

49. The Neuroscience of Problem Solving

https://jochesh00.wordpress.com/2017/05/27/the-neuroscience-of-problem-solving/

50. A Standard Method to Understand Neurochemistry’s Complexities

https://jochesh00.wordpress.com/2017/05/30/a-standard-method-to-understand-neurochemistrys-complexities/

51. Problem Solving for Self Driving Cars: a Model.

https://jochesh00.wordpress.com/2017/06/10/problem-solving-for-self-driving-cars-a-model/

52. A Trio of Relationships and Connections

https://jochesh00.wordpress.com/2017/08/04/a-trio-of-relationships-connections/

53: Einstein’s Great Subtleties:  Einstein’s Edge

https://wordpress.com/post/jochesh00.wordpress.com/583

54. The Problem of Solving P not Equal to NP

https://jochesh00.wordpress.com/2018/04/28/the-problem-of-solving-p-not-equal-to-np/

55. How to Create a Blue Rose

https://jochesh00.wordpress.com/2018/06/02/how-to-create-a-blue-rose/

56. The Etymologies of Creativity

https://jochesh00.wordpress.com/2018/06/14/the-etymologies-creativity/

57.  A Basic Model of a Unifying System of Most All Knowledge

https://jochesh00.wordpress.com/2018/07/06/a-basic-model-of-a-unifying-system-of-most-all-knowledge/

58. Understanding Psych with S/F Brain Methods

https://jochesh00.wordpress.com/2018/07/11/understanding-psychology-with-s-f-methods/

59. The Wiggins Prime Sieve

https://jochesh00.wordpress.com/2018/08/02/the-wiggins-prime-sieve/

60. The Complex System of Love

https://jochesh00.wordpress.com/2018/08/22/the-complex-system-of-love/

61. The Limits of the Comparison Process

https://jochesh00.wordpress.com/2018/08/27/the-limits-of-comparison-processing/

62.  The Bees, Cortical brain structure, Einstein’s Brain, etc.

jochesh00.wordpress.com/2018/09/14/the-bees-cortical-brain-structures-einsteins-brain-the-flowers/

 

63. The Wiggins Prime Sieve, version 3.

https://jochesh00.wordpress.com/2018/09/15/the-wiggins-prime-sieve-version-3/

64. The Prime Quartets Method

https://jochesh00.wordpress.com/2018/10/04/prime-quartets-method-capabilities-insights-sans-limits/

65. Is Goldbach’s Conjecture True And/or False, Conditionally?

https://jochesh00.wordpress.com/2018/11/17/is-goldbachs-conjecture-true-and-or-false-conditionally/

 

The Complex System of Love

Written by Herb Wiggins, MD, Clinical Neurosciences; Creator/discoverer of the Comparison Process;  22 Aug. 2018
The Complex System of Love, AKA the sociability of the species
.
This large concept drives most sociable species, from the herds, to the vast flocks of birds, to the schooling of the fish species, even the aggregations of the anemones in the intertidal zones, and the overwintering groups of the Coccinellidae of like species. Likewise the migration by landmarks of the overwintering Monarch butterflies in vast groups, as they fly south to avoid winter, some staying in California, and many into the Sierra Madre Oriental of northern Mexico. From the migrations of the beasts together, the great wildebeest migrations in Africa, to those of the elephants, bison and most ungulates, each using their standard recognitions and learned landmarks, which get them around. And even the great whales, orcas and porpoises in their pods are very similarly enjoined and conjoined by such associations. Like moves with like. Like knows like, and that strongly implies comparison processing at work.
.
We humans, by comparative ethologies and behaviors, are united in this way to the animals, and even to the plants, which tend to grow in groves, as the Sequoias do, and as the beech/maple forests of temperate North America have done. These are very ancient behaviors, as the beech forests of ancient Antarctica show, as the palms do, and as the Calamites forests of the vast ancient coal measures also show us. The congregations of the species are so very widespread over most all spaces and times, and unify us all, because it's there without limits.
.
But in animals, and esp. the primates, it's reached a development which, altho not exactly the same as in the other animals and plants, is yet widely seen and very great. The enormous cities and metropolitan areas of humans have extended this tendency to vast sizes and impacts upon most all life. And yet it's all quite simple to understand, if the necessary concepts and neuroscientific knowledge are used. Towards a complex system understanding of our universe: the Promised Land of the Undiscovered Country.
.
.
But what of love? Well, early on we learn of Eros, the philias and Agape, a short form of one of the categories of Aristoteles, which lists the related/associated (comparison process, CP) forms of events in existence and allows us to understand and comprehend, by these comparison process relationships and associations, the methods/models of hierarchical organization which spring naturally, and efficiently, from CP.
.
Thus we extend the trinity of the above loves to a more complete category of the loves: the Maternitas, the love of a mother for her children, which raises them to their best outcomes, if properly working; the love of a man for a woman, or many women; the love of brothers and family, and friends, the larger philias; the love of people for their nations and cultures, AKA patriotism; the love of persons for pets, homes, and other "keepsakes" and forms of sentimentality of unlimited kinds.
.
And then we find the antonymics of the above loves, as well as their synonymics, of these kinds of loves, of all kinds, and not just the single words, but the larger phrases which are also synonymics: the hates, the griefs and grieving, sadness, the feelings of loss, the homesicknesses, and much else. But that's another category and will not be addressed here, altho it is important to understanding the loves and the sociabilities of human kinds, that which we see in our near and close ancestors and contemporaries, as well as in the earliest human groups and forebears. This commonality unites us all.
.
But how to explain these complex behaviors of love? It's quite, quite simple, and it unites behaviorism with the modern, complex system cognitive neurosciences, the latter being the purview of almost all human mind/brain outputs. Very simply, loves of all kinds are conditioned responses created by dopamine and the catecholamines, very ancient neurochemicals which have been addressed before. And this dopamine euphoria enhances long term memory lay-downs.
.
The Balanced Brain, and the DA Standard for Understanding Neurochemistry.
.
.
The reinforcing, "feel good" agents are the parent, dopamine (DA), and adrenaline and norepinephrine. The endogenous opiates and the like, such as the endorphins, in all their rich panoplies, are also implicated in this. So is the bonding neurochemical, oxytocin, which seems to play a major role in the bonding of mother to child, and in other forms of relationships as well. The DA euphoria response of the "Eureka" finding of Archimedes acts yet again as the Pleasure of Philosophy of Aristoteles. The love of learning is yet another aspect of much the same set of events in the huge complex system, neuroscientific category.
.
And thus we have it. Anything using DA and the related systems, by those same relationships, can create a positive reinforcement, and this is how we unite the neurosciences, once again, with behaviorism.
.
Melding Behaviorism with Neurosciences article, here.
.
.
That positive reinforcement is the basis of the bonding and establishes the basic role of neurophysiology in the loves of every kind. It's very reinforcing when it occurs, whether with the basic classical conditioning of Pavlov, which is likely seen in most all of the higher animals and some of the lower as well, or as the reinforcing "pleasure principle" of Freud. And we observe the rich panoplies of this everywhere, in this Category of Loves, showing the rich power of Aristoteles' conception and its near universal value, and why he was so successful and great. It was a form of understanding and managing complex systems, and it so works today, ever more so. And it shows also how we create, by those means, the hierarchies of our understanding, too. For each subtype of the Tree of Life contains the many species and variations, the root parts, the categories which are seen at each level, from which all the rest is built.
.
The egosyntonic effect is bonding, reinforcing, and is largely dopamine driven and created. If it feels good, the animal is likely to do it again. If it tastes good, the gustatory sense and the sensations of fullness of the stomach will reinforce eating. We like to eat, or we die. And the endorphin outputs of the stomach are well known, large and highly established. They can give us pleasure, too, or enhance sleep, as the opiates do. Thus we see yet more of the same larger model of likes and loves. "He loves to eat." It has both a survival aspect and drives the culinary arts. It's universal in our species, where the neurophysiology is working properly. The stomach is so richly innervated, and synthesizes such "feel good" chemicals so hugely, that it's almost a truism.
.
We like wit and humor, and each of these is DA boosted, with DA as the parent, ancient neurochemical upon which the actions and traits are based, in all the myriad, complex system ways. And the humor makes us laugh, and we tell that joke again and yet again to others, and they to others, and so it goes viral, or exponentiates and spreads widely. And so the "memes" of Dawkins spread, too. This is the great integration of our understanding of what's going on, not just in our species, but in most of the species.
.
And so there that is. The egosyntonic boost the narcissist uses for self stimulation with most of his actions, to reinforce and continue them, is also DA based. The very loves of most all kinds show this same egosyntonic boost, as do the behaviors of those who feel friendly to others, those who love others, or homes, pets, and all of the sentimentalities of those, as well.
.
In nearly every case, this basic euphoria created by love becomes a self-sustaining part of our behaviors. And eventually brain hardwiring creates it, stable and self sustaining. Look at two lovers, and observe this: the scores of small kindnesses given to each other, daily; the behaviors towards each other, occurring every day. The detailed, large number of methods and techniques which make love last are all DA and related neurochemicals acting. He looks at her. She smiles back. He puts a hand on her arm, and may squeeze it. She smiles more broadly and then speaks of her love for him, and gets that back in return. Both smiling. That euphoriant emotion of love and affection. They may kiss. This is how the loves, the friendships, maternitas, and all the rest work. The scores of kindnesses all DA boost their feelings for each other and sustain and stabilize them.
.
Humor, wit, comedy ALL act in these same, universal ways, and this simplifies the entire categories of loves, pleasures, affections, and behaviors down to a simple, widely and deeply acting mechanism, of many, in fact unlimited, types. "Oh, Honey, that's our song!" easily ties the musics into this simple, essential behavior set.
.
From the simple DA boost and related neurochemicals, to the conditionings, and then the eventual long term memory of dendritic proliferations and synaptic lay-downs, creating the brain's hardwiring of the LTMs of Dr. Wilder Penfield of McGill University: it's all closely related, different, yet with a common basis. Just as with the protons, neutrons and electrons, which make up all of the rich complexity of the atoms, elements and vast, unending isotopes; from the simplicity of that trio of basic particles forming fermionic matter comes the richness and complexity of the periodic chart of the elements.
.
Thus do we generate the loves of humans for each other, and understand them better. This vast unifying model is quintessential to our understanding of most human behaviors, as well as those of all the other animals and most plants, too: these reinforcing, positive outputs, these growth creating skills. In order to understand something, Feynman stated so clinically, deeply and comprehendingly, "If I want to understand an event, I must be able to generate it." And how we create the protons, electrons and neutrons to create/generate matter is not that much different from how we create neuroscientific understanding, OR the rest of the hierarchies of our understanding. Just variations on the same theme, La Chanson Sans Fin.
.
This shows we generate a deep, wide, simplifying, elegant understanding of not just one love, but of the many manifestations of those loves commonly generated, without limit. We explain a very great deal with a little, including behaviorism, as well as the schooling, herding and societies of humans of all kinds, with these simple basic rules. And explaining a very great deal with a little is the essence of least energy, and of useful, growing creativity, too.
.
And that's how it's done. And the details, and the studying of the specifics to multiply confirm this model, can proceed without limits. Our special thanks to Dawkins, and Freud, and the many others who showed us that such very widespread types, actions and behaviors had common origins and could create growth without limits, using the positive reinforcements of behaviorism, which are explained there and then melded into, and become part of, our neuroscientific models.
.
And how we learn is intimately related to how we love and learn to love, in all the rich panoplies of the loves, from the maternitas, to the brotherly love in the families and among friends, to the man/woman loves which are essential to human survival and generational continuity, and so forth. And the evidences for this are without limit and confirming in most all cases, with a very great deal to discover, too. The Promised Land of complex systems.
.
But that's the complex system modelling which we have: the LTM, being least energy, creates the bonding, the continuing reinforcements, and thus the stabilities and persistences of such love driven associations as well.
.
Simple, elegant, explanatory without limits, predicting new findings, as being fruitful and almost universally (but not quite) applicable; and quite, quite capable of being confirmed without limits, too. We learn to love and we love to learn. It’s that simple.
.
Could it possibly be that simple and explain that much? Indeed, very likely.
.
Least Energy Rules.

The Wiggins Prime Sieve:

The Wiggins Prime Sieve, or How Creating Math Creativity Sorts The Primes
.
By Herb Wiggins, M.D.; Clinical Neurosciences; Discoverer/Creator of the Comparison Process/CP Theory/Model; 14 Mar. 2014
.
Simplify, Simplify, Simplify
.
Efficiency, Efficiency, Efficiency
.
Least energy, Least Energy, LE Rules
.
This article is written with many goals. It shows how mathematical creativity comes about, with protean consequences for solving NP not = P. It shows how the Comparison Process (CP) creates descriptive, basic information and data, and how THAT information is sorted by least energy methods to create an efficient prime sorter. This is the essence of the new complex system, cognitive neuroscience model, which uses the nearly, or at least far, far more universal processors, CP and LE, and the many methods of comparison (Dr. Paul B. Stark, UC Berkeley), which create mathematical creativity. Step by step. This new model creates creativity without limit, and enlightens us as to much more of WHERE math creativity comes from, and HOW to create methods of prime sorting with higher and higher efficiencies, literally without limit. This article will make those who use RSA for security very, very nervous!
.
Read on to find the wellsprings of math creativity (& indeed creativity of nearly ANY kind in any field) and why and HOW Thermodynamics drives information theory and HOW and why IT works. Then most will have found some of the water of math creativity, but NOT the well from which it comes.
.
This is an efficient method to sieve out 75% of the number line; it generates the Prime Multiples (PrM) and, by exclusion/process of elimination, the primes. These beginning paragraphs set the stage and background for understanding, and for the empirical methods which create empirical mathematics and its applications.
.
The basis of problem solving and of creating understanding is complex system based, and thus very complicated. However, it's manageable by the process of finding the simplicities which create/generate the complexities. We move from the simple to the complex, as a rule; or, in this case, from the easily found primes to the primes with very long numbers of digits.
.
For instance in understanding the complexity of cloud formations we must understand the three basic wave fronts of the weather. The Northerly dry, cooler fronts, the Southerly blowing warm, wet fronts, and these imposed on the prevailing Westerlies. We see these complex system stabilities, and then use those pattern recognitions to solve weather problems.
.
Then we begin to see the repeating, complex interference patterns of the clouds as part and parcel of these 3 interfering, interacting wave fronts. We see bits and pieces of the waves, clearly from the West, the North and the South. And then we see the triangles of the clouds' complex interference patterns, and thus can generally see the SW winds, the NW winds, and those coming by combinations of the 3. From the simple fundamentals, we can develop an understanding of the complexity of the whole.
.
In the same way, the complex system of plate tectonics is composed of about 5 simple patterns, found by pattern recognitions, which describe in words, tho not yet mathematically, the multiple complex systems of the earth's surface and most all of the plates. The plates, compared to each other, have these characteristics. There is sea floor spreading, of various types, directions and so forth, but largely the Mid Atlantic spreading zone drives the westward movement of the Americas, from the splitting off of the Americas from the Euro-Asian continents and Africa. So the American plate moves west and plows into the Pacific plate. That creates the next major fundamental pattern, subduction of the plates under the western edges of the Americas, which then creates the greatest quakes, and the volcanoes, all along the western Americas.
.
Then there are the fault movements accompanying these complex movements, esp. shown by the San Andreas, which is moving mostly northwards, carrying a chunk of the Pacific plate, the most basic element of tectonics, northwards with it. There are many kinds of faults, a sort of hierarchy of them. Those are the great simple patternings which lead to an understanding of the whole, but not completely.
.
Then there are the hot spots, which created the many basaltic lava flows of the Hawaiian isles; the Yellowstone hot spot, from the Columbia River flows onwards; the Deccan traps; and the Siberian traps. Thus most of the complex plate interactions are described, but not easily predicted, by this model, which is descriptive and thus mathematically difficult to model. As Ulam stated, math must greatly advance in order to describe/model complex systems.
.
The problem of finding, of sorting out, the primes is just one of a long series of sorting problems, which are unlimited, in finding solutions to problems, a grand sort of NP not equal to P. If we can solve, better and better, this problem of creating the Prime Multiples (PrM), which, when compared to the number line, leave the primes, then we can necessarily exclude the PrM's and show the primes. We cannot generate the primes. We can ONLY find them by excluding all numbers which are prime multiples. Those two exclude each other: a prime cannot be a PrM, and the converse is also true. Therefore, by finding a method which generates, creates and finds all possible PrM's, then by exclusion, an efficient sorting process, we can find all the primes. And the consequences of that are not just for math alone. This is yet another way to do it: an Eratosthenes sieve, as modified by Paul Pritchard, and then greatly extended and sped up here.
.
Now we go to Gauss' Razor: mathematical systems should ONLY be created for practical purposes, such as in engineering and the sciences, etc., or for those which show us how mathematics works. The rest of it is so often fantasy that Gauss rid math of those of inutility by his simple dictum. He sorted out the wheat from the chaff, and made math more efficient.
.
The Razor is a form of empirical tester and sorter for mathematics. As we know, Godel showed that math is incomplete. The Razor helps to complete it by empirical testing and sorting methods, which are least energy efficient, by that sorting by trial and error. That is what creates math: a creative finding, or creation, of the math which more precisely describes events. This has only begun, because complex systems cannot yet be described very well mathematically. As Ulam affirmed, math must greatly advance before it can describe complex systems.
.
The empirical testing which Gauss' Razor relies upon creates empirical methods to apply the maths. It also recognizes that understandable maths are needed, too. And so he cut through the mass of math generation, of too many useless types, to ONLY those which had a real, practical application, by sorting to show where those worked. As we know, the Lyapunov numbers model to some extent, but not completely (Godel), the least energy stabilities. Brain processes create consciousness, as Friston's essay in Aeon showed, very rigorously. That is a fine example of the empirical, creative application of maths. The same is true of the S-curves of Whitehead, based upon what has been written before (see the S-curves, least energy discussions).
.
The rule applies to the non-parallel-line axiom which created the non-Euclidean geometries. But LOOK! Spherical geometry is NOT Euclidean, because geometry using the longitudes is not a Euclidean geometry: there are no parallel longitudinal lines! AND further, the comparison of circumference to diameter created THAT new method, Pi, which solved the problems in the spherical geometries neatly. Information WAS added to create the new method, i.e. Pi. And by that example, the non-Euclidean, 4D maths were created by at least 5 mathematicians, independently, and used to model Einsteinian 4D space-time. Again, pattern recognition, independently creating, as with the calculus, another useful form of mathematics. The creation of Pi created the new math!
.
The general rule is that our CNS works via comparison processes to create recognitions, and from those creates the pattern recognitions, by detecting the repeated events in existence created by complex systems. Ulam, Fermi and Pasta showed from the first that the repeating, stable events in complex systems are those which allow us to understand, control and navigate them. The rule is: comparison processing creates recognitions. Then pattern recognition shows the relationships among those repeating stabilities (Einstein, "Physics and Reality", 1938). And then pattern recognition, pattern recognition, and then pattern recognition, without many limits. Those create the efficient categories of Aristoteles, which create the highly efficient, least energy, hierarchical arrangements of our knowledge in most all cases. From the simple CP to the complex. From the simple number line, to the simple repeating PrM's, we can do the job.
.
Therefore we have this new, efficient prime sorter, which uses the Prime Multiples (PrM) to do the work; and it works, using basic math sieving and empirical means, PLUS the PrM's to complete it.
.
Essentially we must find and exclude the PrM's to sort out the primes on the number line. This must start with 1 and work up. First we note that every even number, except for 2 itself, is not prime. Then we note the same for the multiples of 3, which creates the casting out of 2's and 3's. Then we note that 2X5, in all of its forms, creates the numbers which end in -0 and -5. Thus we can quickly sieve out 60% of the candidates on the way to the primes.
Then we note this further, efficient to use, pattern: all prime numbers above 5 end in 1, 3, 7 or 9. Thus we can begin to sieve OUT, by using this method, all the prime multiples.
This creates the first set of primes: 1, 2, 3, 5, 7. We then use those sorting rules on the next decade: 10 ends in zero, not prime, but a PrM; 11, 13, 17, 19 remain, and none of that quartet can be generated as a prime multiple, so they are the remainders from the exclusion of the PrM's. Next we have 21, 23, 27, 29. Using the casting out 3's sieve, we are left with 23 and 29. Those are primes, because there are NO PrM's generated by our tables which equal them.
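A minimal Python sketch of those two rules as I read them (the helper names are mine): build each decade's quartet of candidates ending in 1, 3, 7, 9, then cast out the 3-multiples by digit sum; the survivors still need the PrM exclusion step described below.

def quartet(decade_start):
    # The four candidates in one decade that can still be prime: endings 1, 3, 7, 9.
    return [decade_start + d for d in (1, 3, 7, 9)]

def survives_cast_out_3s(n):
    # Keep n only if its digit sum is NOT divisible by 3.
    return sum(int(c) for c in str(n)) % 3 != 0

for start in (20, 30, 40):
    q = quartet(start)
    kept = [n for n in q if survives_cast_out_3s(n)]
    print(f"decade {start + 1}-{start + 10}: quartet {q} -> after casting out 3s {kept}")

# decade 21-30: [21, 23, 27, 29] -> [23, 29]
# decade 31-40: [31, 33, 37, 39] -> [31, 37]
# decade 41-50: [41, 43, 47, 49] -> [41, 43, 47, 49]  (49 = 7X7 is left for the PrM step)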
.
Thus we have the original 3 and 7 prime multiples, then, more completely, the 11, 13, 17, 19 primes to create the prime multiples (PrM's). And then we enter those into the 20's decade, and find 23 and 29.
.
Then we enter those into the 31, 33, 37, 39 prime quartet of the 30's decade, sieve out the 33 and 39 as 3-multiples by adding up their digits, and we know we have 31 and 37. Using this same method without limit, in every decade and every centad (100 numbers), we can sieve down to only 40 candidates per centad, which is a 60% reduction in the search size. Using the prime 3 cast-out system, by summing digits, we can efficiently eliminate 13 to 14 (most commonly) more of each centad's quartet candidates. Then we use the PrM generating tables to show which of the remaining numbers are NOT prime, leaving, by exclusion and sorting out, in a complementary pattern, the primes.
.
The same can be done by hand, in about 15 minutes if done efficiently, for the number line from 100 to 200, finding all the primes by this method, too. But understand that the squares of the primes, times all the primes, are needed to further exclude all of the PrM's. This uses no more than that, and rids about 5% of the remaining possible primes. Because those numbers rise very quickly, there are not many of them; it's simply the last bit of PrM elimination needed to get the primes.
.
.
Then we carry out that method without limit, generating the prime multiples, comparing them to the quartets of each decade, 40 per 100, sorting, and excluding them from the number line, thus leaving the primes.
.
.
That’s how it’s done. See the next section of the PrM tables which do this, efficiently.
.
.
Thus most all of the work is done, and this empirical, efficient system works. It's been checked through 2000 to show the ins and outs of it.
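To make the "checked through 2000" claim concrete, here is a minimal, hedged sketch of the exclusion idea as I read it; the function names, the choice to strike every multiple starting from each prime's square, and the trial-division cross-check are my additions, not from the original.

def wiggins_style_sieve(limit):
    # Candidates: numbers ending in 1, 3, 7, 9 whose digit sum is not divisible by 3.
    candidates = [n for n in range(7, limit + 1)
                  if n % 10 in (1, 3, 7, 9)
                  and sum(int(c) for c in str(n)) % 3 != 0]
    primes, struck = [2, 3, 5], set()
    for n in candidates:
        if n in struck:
            continue              # n was generated as a prime multiple (PrM): excluded
        primes.append(n)          # never generated as a PrM, so n is prime
        k, m = n, n * n           # start the PrM series of n at its square
        while m <= limit:
            struck.add(m)         # striking every multiple is simpler here than
            k += 1                # skipping the ones the other rules already removed
            m = n * k
    return primes

def is_prime(n):                  # plain trial division, used only as a cross-check
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

assert wiggins_style_sieve(2000) == [n for n in range(2, 2001) if is_prime(n)]
print(len(wiggins_style_sieve(2000)), "primes found up to 2000")   # prints 303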
.
.
Understand these basics: we cannot generate the primes, but the pattern of generation of the PrM's is the key to the pattern of the primes. We use the discovered primes to quickly generate the PrM's, which sort out a great many more primes, which keeps the system going and propels the finding of more of the primes. It's accurate, and can be done very efficiently by noting the many simplifying rules which generate and ID all known PrM's; and that's the general method. Thus the primes found efficiently sort the number line to FIND the rest of the primes. It sounds like impossible bootstrapping, but empirically we are generating a lot more primes in each centad than we need to find the primes about 4-10 centads ahead, and so forth. We feed back in the primes sorted out by the exclusion of the PrM's, which then allows further work to be done. In short, the primes are used to sort out the primes.
.
.
The primes CANNOT be generated at all by arithmetic; they are found by exclusions, and that is not an arithmetic rule either, but sorting, which is something quite different from simple mathematics. We use ONLY arithmetic to find, to sort out by exclusion, the primes. Simple and effective, too. See the prime multiples (PrM) tables below for examples of how to create those PrM's, quickly.


.

.
The Prime Multiples tables below show us how to do this, using a new method which is efficient, basic, and can form the basis of much more mathematical understanding of the primes and of how to find them, without limits.
.
.
Efficient methods are least energy methods, and ridding the number line of nearly 75% of the candidates for primes is efficient, as per the starting paragraphs, which show the way this can be created.
.
.
So, in order to solve the prime number sorting problem, we must go to the basics. What is a prime number? It's defined by the multiplication of 1 times that prime, which is the ONLY multiple a prime can have; it cannot be a multiple of any other numbers. Then we have the concept of the Prime Multiple, which means the number is NOT prime but is composed of a set of primes multiplied against each other, the least of which is two primes. All of the other prime multiples are of the type prime times prime, times prime, etc., but must be at LEAST two primes multiplied together. Finding a single PrM factorization necessarily excludes the number from being prime, and we do NOT have to know any of the other factors, either. That short cuts (efficiency) the problem, very quickly.
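A small sketch of that short cut, under my reading: one known prime factor is enough to exclude a candidate, with no full factorization needed. The helper name and the square-root cutoff are my additions for brevity.

def smallest_known_factor(n, known_primes):
    # Return the first known prime that divides n, or None if no factor exists.
    for p in known_primes:
        if p * p > n:
            break                 # no factor at or below sqrt(n): n must be prime
        if n % p == 0:
            return p              # one factor is enough: n is a PrM, stop immediately
    return None

known = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31]
for n in (91, 97, 119, 121, 127):
    f = smallest_known_factor(n, known)
    if f:
        print(f"{n} is a PrM (one factor found: {f}); excluded")
    else:
        print(f"{n} is prime")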
.
.
No longer is it necessary to divide every number on the number line by all the numbers up to at least 1/2 of that number to find the primes. That is an immense savings of a LOT of computing and time. Primes can be found using this method very easily, if well organized, with only a calculator, limited only by the number of digits that calculator can hold. If the device can hold 13 digits, the primes can be generated up into the trillions, sequentially, and more besides.
.
.
 So if we generate PrM’s , then we know none of them are prime.
.
.
Therefore PRIMES cannot be generated by any mathematical process in arithmetic; they are, in fact, found by PrM exclusions and eliminations. A method of sorting, IOW.
.
.
So a number which has a prime multiple form, or which is even and greater than 2, cannot be prime. Thus only the number line, as it is generated, can contain the primes, and ONLY by exclusion, by elimination of the Prime Multiples (PrM's), can we find, sort, and detect the primes. That basic, fundamental rule creates an efficient method, using only simple arithmetic, to find all of the prime numbers, and to exclude, by a simple comparison process, everything which is not prime.
.
.
Thus we create this vast simplification of exclusion and elimination, starting with the simple and moving up the number line to the complex.
.
.
But what about 3? We can exclude its multiples because it's known that, except for 1X3, which defines that prime, NO other number with 3 as a prime factor can be prime. Thus the casting out of 3's method: sum up the digits of any number, such as 123, or 561, etc. If the total is divisible by THREE, then the number is a multiple of 3, and cannot be prime. This sorts out 1/3 of the number line and, empirically, about 14 of the 40 numbers in each centad's 1, 3, 7, 9 ending quartets. Thus the 75% sorter.
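The cast-out-3's rule in code, as a minimal sketch; the choice of the centad 201-299 for the count at the end is my arbitrary example.

def cast_out_3s(n):
    # A number is divisible by 3 exactly when its digit sum is divisible by 3.
    return sum(int(c) for c in str(n)) % 3 == 0

for n in (123, 561):              # the worked examples from the text
    print(n, "digit sum =", sum(int(c) for c in str(n)), "-> multiple of 3:", cast_out_3s(n))

# Count how many of one centad's quartet candidates the rule removes:
quartet_candidates = [n for n in range(201, 300) if n % 10 in (1, 3, 7, 9)]
removed = [n for n in quartet_candidates if cast_out_3s(n)]
print(len(removed), "of", len(quartet_candidates), "candidates cast out in 201-299")   # 14 of 40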
.
.
Empirically, by sorting through to 2000, we see that the number of primes per centad varies, often from about 12 to as many as 18 or so. And with the higher numbers of primes found, the process is even more efficient than 45% in those cases.
.
.
And there is a very simple rule which can cast out the 3's, which works better and better the larger the number of digits in the quartets. Another of the unlimited efficiency rules which can be generated to speed up the prime sorting processes.
.
.
So we can generate THESE decades, as we've largely excluded most all of the 3-multiples and the other endings. It's down from 10 numbers per decade to only 4, a vast simplification, as shown in the first paragraphs of this article. And when those 4 per decade are also reduced by casting out 3's and by exclusion, what's left in each centad is about 25 numbers which are possible primes.
.
.
Then we extend this method even further by the PrM series, starting with 3 as a demo and simplification, and then 7, 11, 13, etc., the multiples of the other primes in sequence. That creates, generates, a PrM, and not a prime number, necessarily. And by extending the system using 3X3, 3X5, 3X7, 3X9, 3X11, and so on, we see that NO such multiple of 3 by the higher numbers in sequence CAN be prime. Thus we are left with 1, 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, etc. as the prime series in the 10 decades of the first centad, by exclusion/elimination of the PrM's, which cannot be primes.
.
.
This simple sorting, exclusion, elimination method cannot generate the primes, but it excludes the PrM's and leaves the primes by a complementary process. The primes cannot be generated, but they can be found by the PrM exclusion process, which generates all the PrM's!
.
.
The pattern of the primes is ONLY complementary to the complex interference patterns of the PrM's. That's the pattern which has been seen, and within it many other efficiency related patterns can be seen, to speed up the calculations. Without limits!
.
.
Pritchard used a method of gaps, 6, 2, 4, 2, 6, 4, which only worked for a while, and which was ALSO generated by the series of PrM's, and those always had to be generated. And it did not rule out candidates by casting out 3's, or numbers ending in 0 or 5. Thus there was a LOT more work to do.
.
.
Therefore we have, if it is used accurately with good accounting, found a simple way to find the primes: by creating and ID'ing ALL of the PrM's in set sequences, which necessarily gives us the primes.
.
.
There is a simple pattern to creating those PrM series, which will now be shown. The pattern is IN the PrM’s which then creates a complementary pattern of primes. That’s the only pattern which can be seen as to how it arises. It will be shown below. There is NO prime sequence pattern.
.
.
Part 2: The Prime sorting method using PrM tables.
.
Below is the general, simple arithmetic method for creating the prime multiples (PrM's, AKA the composite numbers). It leaves out the numbers on the number line which are prime, because NO prime can be generated, while the generated numbers are all created by arithmetic processes. The method then uses exclusion by the PrM's.
.
.
Note we do NOT have to fully factor any suspected prime; we only have to show that it's a PrM, a product of at least two primes. This greatly simplifies the process.
.
.
Understanding this is easy. Any even number, except for 2 (1X2), cannot be prime. The repeating quartet of four possible endings is found in every decade, 1-10, 11-20, or 101-110, 111-120, etc., without limit: the end digit of every prime number above 5 is always 1, 3, 7 or 9. That massively simplifies the sorting process to find the primes. It's an empirical form of mathematics, NOT a logical one.
.
.
Also note that this table is relevant to understanding that quartet of end numbers:
 end 1 can be generated by 1X 1; 3X 7; and 9X 9.
end 3 can only be created by 1X 3; and 7X 9.
end 7 by 1X 7; 3X 9.
& end 9, by 1X 9 and 3X 3, and 7X7;  such as 3X 13, or 23X 83 prime multiples.
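The end-digit table above can be checked exhaustively in a few lines; this quick sketch simply multiplies every pair of allowed end digits and groups the products by their final digit.

from itertools import combinations_with_replacement

endings = {}
for a, b in combinations_with_replacement((1, 3, 7, 9), 2):
    endings.setdefault((a * b) % 10, []).append((a, b))

for last_digit, pairs in sorted(endings.items()):
    print("products ending in", last_digit, "come only from end-digit pairs", pairs)

# ending 1: (1,1) (3,7) (9,9);   ending 3: (1,3) (7,9)
# ending 7: (1,7) (3,9);         ending 9: (1,9) (3,3) (7,7)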
.
.
That vastly reduces the choices. And the above prime-times-prime pairs, squares and cubes, etc., the multi-prime creators, finish the job.
.
.
It is also necessary to factor out every prime number squared. As those begin each PrM series, it's necessary to eliminate every number of the form p^x, with x >= 2. For instance, in the series up to 2400, we know to eliminate 343 and 2401, etc., as those are powers of 7 with exponent 2 or greater.
.
For example, take the 30's and 40's decades of the first centad:
31, 33, 37, 39: casting out 3's in this quartet leaves only 31 and 37.
41, 43, 47, 49**: casting out 3's leaves 41, 43, 47 and 49. The last is a prime multiple, 7X7, a PrM, not a prime. Those prime squares will be treated in detail at the end.
.
.
These four end-digit lists can be extended without limit to numbers of any length, whether 3, 4, 4000, or 4 billion digits.
.
.
The last step is more laborious, but it uses the above series: 3X each prime in sequence, 7X each prime in sequence, then 11X, 13X, 17X, 19X, and so forth. THAT creates the prime multiples whose end digits are 1, 3, 7, and 9, finishes the exacting job of excluding the PrM's, and leaves on the number line the numbers which remain: the primes.
.
.
It’s not known if there is a casting out 7 method which works as well as casting out sums of digits divisible by 3 or not, but is being worked on. & likely can be found, altho this will only save about 5-6 numbers int he 40 number in each centad, but is a good extra example of the multiplicities of finding NEW, unexpected simplifying methods drive by brain pattern recognition.
.
.
Thus we do NOT generate primes, and cannot. BUT we can use the primes to generate the PrM's (prime multiples), which are NOT primes, and those can ALL be generated by the simple arithmetic series/progression shown at the start. Thus if a calculator has 9-12 digits, we can calculate primes into the billions, but NOT to billions of digits; that requires a computer.
.
.
That is the key to sorting out the primes. And as the primes are sorted out several times faster than the PrM series need to be generated, the method never runs out of primes to feed back into the PrM generation. For instance, the number of primes sorted out is only a fraction of the numbers lying between each prime's multiples, such as 3, 7, 11, 13, 17 & 19 times 100. Thus the PrM generator is unlimited.
.
.
This gives us some 400 numbers sorted between the 7 series and 111; about 200 numbers sorted between 11 and 143; and another 400+ between the 13 and 17 series. Thus a single pair of primes can create dozens of primes, and then those primes create the next series of PrM's. It's self-sustaining. With some positive feedback, and with the rising size of the products, this creates a very rapid growth in the number of PrM's generated by this means, too.
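A rough sketch of that self-sustaining loop, under the assumption that each new block stays below the square of the smallest prime not yet found (function and parameter names are mine): the primes already in hand generate the PrM's that sieve the next block, and the new primes are fed back in.

# Illustrative self-sustaining loop: known primes generate the PrM's for the
# next block; the primes uncovered there are appended and reused in turn.
def extend_primes(primes, block_size, blocks):
    lo = primes[-1] + 1
    for _ in range(blocks):
        hi = lo + block_size - 1          # must stay below (smallest unknown prime)**2
        prm = set()
        for p in primes:                  # only the already-known primes are needed
            if p * p > hi:
                break
            first = max(p * p, ((lo + p - 1) // p) * p)
            prm.update(range(first, hi + 1, p))
        primes += [n for n in range(lo, hi + 1) if n not in prm]
        lo = hi + 1
    return primes

print(extend_primes([2, 3, 5, 7], 50, 2))   # the primes up to 107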
.
.
Here are the exemplary PrM series tables for each prime in the -1, -3, -7, -9 end number series.
.
.
The Three series
3X1 = 3
3X3 = 9
3X5 = 15
Here begins the -1, -3, -7, -9 end-digit series:
3X7 = 21, cannot be prime
3X9 = 27
3X11 = 33
3X13 = 39, etc.; every member of the Three-multiple series is NOT prime. This is the empirical legitimacy of the casting-out-3's method.
.
.
The next series starts the entire system: by a simple subtraction, multiplication, and addition, a general, universal method requiring NO other math, we obtain the entire PrM listing, except for the prime squares and their powers, which will be shown to be simple to handle at the end. As those are only about 3% of all the numbers to be sieved, this step completes the other 97% of the PrM's, which, by exclusion, sort out the primes.
.
.
The simple, mathematically clean arithmetic progression can now be seen. We no longer need to multiply any large numbers together; instead we use the subtraction of two primes in succession, a multiplication by the prime of the series, in this case 7, and an addition to the previous PrM. And by casting out 9's we can simply check whether each multiplication and addition is correct. These show the even-gap patterns of the prime multiples which laid the basis for Pritchard's work, altho he did NOT use the 1, 3, 7, 9 ending-quartet simplification, either.
.
.
7X7 =    49  not prime
   7X4 = +28
7X11 =   77
   7X2 = +14
7X13 =   91
   7X4 = +28
7X17 =  119
.
.
Here we have done the 7 Series through 100.
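Here is a hedged sketch of that subtract-multiply-add progression for a single prime's series (the 7 series above), checked against direct multiplication; the names and the cutoff are mine:

# The PrM series of one prime p: start at p*p, then add p times the gap between
# successive primes (subtract, multiply by p, add), as in 49, +28, 77, +14, 91, ...
def prm_series(p, primes_from_p, limit):
    q_prev = primes_from_p[0]
    value = p * q_prev                    # the one full multiplication needed
    series = [value]
    for q in primes_from_p[1:]:
        value += p * (q - q_prev)         # gap times p, added to the previous PrM
        q_prev = q
        if value > limit:
            break
        series.append(value)
    return series

primes_from_7 = [7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]
series = prm_series(7, primes_from_7, 340)
print(series)   # [49, 77, 91, 119, 133, 161, 203, 217, 259, 287, 301, 329]
assert series == [7 * q for q in primes_from_7]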
.
.
Now the next PrM series of 11
.
.
11X11 =  121
   11X2 =  +22
11X13 =  143
   11X4 =  +44
11X17 =  187
   11X2 =  +22
11X19 =  209
11X23 =  253
11X29 =  319
11X31 =  341
11X37 =  407, etc.
.
.
This is followed by 13X13, 17X17, 19X19, and so forth, covering each decade and each centad which needs to be covered.
.
.
Thus we can take, say, 2400 to 2500 and, by finding the prime multiples from 2401 up to 2499, compute ALL the PrM's for that centad and sort out the primes between 2400 and 2500.
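A short sketch of that centad computation, assuming we already hold the primes below 50 (since 50 squared is 2500); the PrM's generated from them cover the whole centad, and the complement is the prime list. Names are mine:

# Sort out the primes between 2400 and 2500 by generating every PrM in that range.
def primes_in_range(lo, hi, base_primes):
    prm = set()
    for p in base_primes:
        first = max(p * p, ((lo + p - 1) // p) * p)   # first multiple of p in range
        prm.update(range(first, hi + 1, p))
    return [n for n in range(max(lo, 2), hi + 1) if n not in prm]

base = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]
print(primes_in_range(2400, 2500, base))
# [2411, 2417, 2423, 2437, 2441, 2447, 2459, 2467, 2473, 2477]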
.
.
With a large enough computer, with enough registers and memory, we can handle ranges of ANY size, say from 1,000,500 to 1,000,600, or from 3,000,100,000 to 3,000,200,000.
.
.
And we can generate PrM's at will over ANY range, once the needed primes have been found.
.
.
Thus, starting from ANY prime, by a simple sorting process using the PrM's which cover that range, we can create ANY listing of primes wherever we choose, as long as we multiply correctly and keep good accounting of the PrM's, which are simple, and of the prime squares, cubes, etc., too.
.
.
There is thus NO limit to the number of digits we can process, within polynomial time.
.
.
Say a banking password uses primes 750-760 digits long. Those can be specifically created as a complete list of primes in that range and then tested, easily and quickly, using this method.
.
.
And if the bank goes up to 2500 digits, and the computer is a RISC machine built to do these tasks specifically, then the primes can be found there as well, and tested. Just get the password code in the chip for the primes, and go from there.
.
.
& the banks can also create vast lists of primes without doing ALL of the prime calculations, limiting the work to the intervals where the primes are to be found. The caveat is that the prime multiples covering those intervals must be computed, too, but that's doable. They only need a password chip, or a set of coded primes, to find out what those are.
.
.
As the computers get more powerful and the registers get larger & larger, there is NO prime series which cannot be worked out arithmetically and found in polynomial time. Thus ALL primes, within the limits of computational power, CAN be found. Without limits, too.
.
.
Using his methods, Pritchard found 22 sequential primes in the range of several millions of digits. And this method is even faster.
.
As each prime creates more PrM's than were needed to find it, the PrM exclusion sorts out more primes than it consumes; thus we find the primes by using the PrM's, which form a self-perpetuating series. This can be programmed, and it can solve the entire series of primes, too, within the limits desired. The math is simple, nothing more complicated than a single subtraction, a multiplication, and an addition, to sort out from the number line, by exclusion of the PrM's, ALL the prime numbers in the target centad. Using a well-designed RISC machine to 1) list the number line, then 2) establish the interval and the PrM's which cover that interval, the entire list of primes can be efficiently created, too. This means every single number, within the limits of time and processing speed, no matter how long, can be found within polynomial time to be either PrM or prime.
.
.
Therefore, with a good enough computer, it's possible to sort out all of the primes VERY easily in polynomial time. The only limit is the power of the computer itself. And by using these methods, clearing the memory stacks at intervals, and letting the computer itself reload the primes, the next series can be generated very quickly.
.
.
And the only limits are the powers of the best RISC machines which can be made to sort out the primes by the above simple system. It simply uses the known primes and then goes efficiently on from there.
.
.
And it can create a series of primes starting at ANY spot, using the known primes between those numbers, too. It's only a matter of sorting by arithmetic now; nothing more needs to be done. Handling the prime squares is a matter of good checking, and carefully chosen PrM series which cover all the numbers up to, say, 1,000,000 will give any prime series after that as well. All that's needed are the data to compute which PrM's lie between, say, 1,000,000 and 1,000,500; then the rest of the prime lists, both above and below, can be computed at will, too. This can generate LONG lists of primes starting at any spot we wish, once the computer has done the prime-sorting work needed to generate the PrM's which lie around the prime targets desired. In a practical sense, ALL of the primes well into the millions are already known, & that allows ever more primes to be found from the prime lists which are already a matter of public record.
.
.
& that’s how it works. The technical delivery of the Primes is of course a lot of work, but then when most any prime number can be selected out and found, and then an entire list of primes can be generated to extend that list without limit, no prime number can escape from being found, regardless of how long it is. This is an encryption nightmare, which means as the computers become faster, more efficient by using better and more efficient methods without limit, that very likely, the race between prey and predator can be seen to extend to this. The better and better sorting methods which can be based on this method are unlimited as int he number line.
.
.
And the time to find the primes is directly related to the power of the computer, regardless of the number of digits. This method can do that; it's just a matter of having a programmer who keeps good accounts.
.
Further, within each PrM series there are even larger patterns than these, & by finding those patterns, more efficiencies in computing the PrM's can be found, without limits, too.
.
.
.
Thus we have substantially shortened the time it takes to find and prove a prime, empirically. & we can further test the method for any possible flaws, OR correct mistakes found in the prime number lists, which are likely to be more common the larger the digit lengths become. That is experimental, empirical math at its best!
.
And by comparing the tables, we can immensely shorten the calculation times because, as in the cases of the sine, cube, and log tables, the work has already been done. The LTM efficiency of such tables is ever more the case, because it cuts down the sorting times substantially. Thus the efficiency, robustness, and sheer elegance of the method, as most all new models must show; and it is fruitful without limit besides, the last, profoundly important element of any good model/theory/method. And that means that LTM and archives of info are efficient, least energy methods, too, which is why we use them: to gain the LE savings, benefits & profits, above all!
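In that spirit, here is a small, hedged sketch of the table/LTM idea in code: compute the base prime table once, store it, and reuse it for every later range instead of redoing the work (the caching decorator and names are my own illustration, not part of the method itself).

# Compute the base-prime table once and serve it from memory thereafter,
# the programmatic analogue of a stored log or sine table.
from functools import lru_cache

@lru_cache(maxsize=None)
def base_primes(limit):
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(range(p * p, limit + 1, p)))
    return tuple(i for i, flag in enumerate(sieve) if flag)

# The second call costs nothing extra; the table is simply looked up.
print(len(base_primes(1000)))   # 168 primes below 1000
print(len(base_primes(1000)))   # 168 again, straight from the cached table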
.
.
There is much more, but this shows how the method of creating PrM's quickly & robustly sorts & identifies primes by simple exclusion, and, when the computer is sized properly and used efficiently, can sort out primes to 10 megadigits in only minutes.
.
This, in short, is the Wiggins Prime Multiples Sieve Method (my thanks to Paul Pritchard, who improved upon an earlier form of what Eratosthenes created first), which robustly, empirically solves the order in which primes exist by generating the not-prime prime multiples; their complementary pattern sieves out, efficiently and exactly, which numbers are the primes.
.
.
It’s the Surprise, unexpected use of the Primes to sort out and find the Primes, which is the key here. And as it’s an input, output system, can grow in power by a factor of 2, 4, 6 or more, the more work is done to create the PrM series, the vaster its power. and applicability.
As it’s unlimited, so the ability to solve the encryption problems with this method is unlimited. & nothing by this primes method can resist being broken, even by brute force. And because for each stage, new, more efficient sorting methods can be created, Without Limit, which increases the efficiencies of this method unlimited, then nothing is safe.
Primes are therefore not encryption-proof, except insofar as the computers cannot yet improve enough to brute-force and find the primes.
.
.
But it’s even worse than that. . As sorting is done best by QC, and calculation done best by chip computers, by combining the calculations of the one, by sorting using the QC, and very fast calculation & sroting can be done to find by sorting, NOT trial and error,m the real primes., Just don’t make a mistake and from time to time, check the multiplication results to make sure they fit.
.
.
Moreover, there is yet another implication of this system: the very easy generation of MORE primes once a prime succession has been found.
.
.
And then we process the rest similarly: once we have the primes in those ranges, we can work out the remaining numbers, say between 2000 and 3000, thus sorting out a new list of primes where before we had fewer.
.
.
And this is how it's done, proven empirically, and tested to be so.
.
.
Thus we can, in polynomial time with powerful enough computers, break any 12-13 digit number, or lower or higher, if we can get enough computing power and powerful enough algorithms, which can be generated without limit by seeing the patterns without limit and short-cutting the problems of factorization. We can find, by comparison processing, two adjacent primes within the range of the given digit size, easily determine whether a candidate is prime or not, and go from there.
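A hedged illustration of the 12-13 digit case (the number and names are chosen by me): at that size, deciding prime-or-PrM by dividing only by candidates up to the square root is still quick on an ordinary machine, which is the brute-force end of the point being made here.

# Decide whether a 12-13 digit number is prime or a PrM by trial division
# up to its square root, using the 6k +/- 1 candidate pattern.
def smallest_factor(n):
    if n % 2 == 0:
        return 2
    if n % 3 == 0:
        return 3
    f = 5
    while f * f <= n:
        if n % f == 0:
            return f
        if n % (f + 2) == 0:
            return f + 2
        f += 6
    return n                       # no factor found below sqrt(n): n is prime

n = 1_000_000_000_039              # a 13-digit candidate
p = smallest_factor(n)
print("prime" if p == n else f"PrM: {p} x {n // p}")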
.
.
IOW, we can create two lines of a single prime series and then generate all of the primes within a set limit, such as 2000 to 3000, or 5,000,000,000 to 5,000,001,000, or any such limits, too.
.
.
it’s simply a matter of generating all the prime multiples &  sorting to find the primes, and using the prime numbers thus found, to generate the “not prime” Prime multiples.
.
.
QED.
.
.
  Keep It Simple!!!