
The Common Thread – Artificial Intelligence – What It Is and How We Got Here

We imagine that at the dawn of man there were at first individuals who focused only on shelter, survival, and reproduction, with little time for anything else. Over time, someone started to think about their existence. Who am I? Where did I come from? Did somebody make me? With the passage of time, someone discussed these ideas with another person, and so on. This eventually resulted in the concepts of gain and loss and their relation to one's behavior. This led to the idea of self: I am born, I live, and I die. Somewhere along this path, the perception of reality emerged. Humans began to understand that survival and the future could only be secured if they were grounded in reality.

 

What is reality?

The actual state of everything, independent of any human opinion.

 

What is truth?

The cognitive interpretation of reality based on empirical facts.

 

We build truth on the foundation of perceived reality, but with the perspective of individual interest. Is it science, philosophy, psychology, religion, etc.? We align our personal beliefs with that underlying perspective. That is the contaminant of our human imperfection. Physical science is unique; its truth is based solely on empirical data. There is no room for opinion; theory and logic stand or fall on the evidence.

 

All groups – societies – develop ways to perceive reality and to test it by gauging the truth of a given reality. This depends ultimately on human reason, with its power and its limitations. Nietzsche warned us that convictions were a greater enemy of truth than lies. Therefore, the gauging of truth is a continuous challenge, and the techniques become more sophisticated with time. We depend on this for improving the quality of our lives, our survival, and our ambitions and pleasures, be they good or bad. That is the secular side of human reason. Human reason is based on our relationship to reality, which is an aggregate of knowledge as interpreted and considered by the human brain with its limitations. Over time, humans detected and then identified an expanding array of characteristics that made up everything they thought mattered. This led to attempts to understand reality in logical and, most importantly, reproducible terms. This was and is learning followed by verification. At first, the scope was limited to the horizon, then in time beyond the horizon, and eventually to the cosmos and its ultimate barrier, infinity. At each step, the search for truth was based on the human brain's ability to process facts – data – but eventually the enormous amount of information became overwhelming. Thus, invention was called upon to create a means of overcoming the brain's limitations: artificial intelligence. AI can process vast sums of data in a brief time window; it can learn, solve, verify, and reveal trends, patterns, order, and disorder, all unrecognizable by the human brain.

 

This document is a review of how we got here – the present state of artificial intelligence and where it might go. The exploration of reality and the nature of truth has progressed through a multitude of various experiences and challenges throughout human history. Man has existed as Homo sapiens for about a quarter million years.

 

Around four to five thousand years ago, the Sumerians – who first emerged circa 5400 BC – engaged in complex interactions with neighboring societies, including conflict, assimilation, and cultural fusion. Over time, the suppression and/or merging of groups such as the Akkadians, Amorites, and Assyrians culminated in the rise of the Babylonian Empire around 1700 BC, which eventually controlled much of what is known as the Fertile Crescent, or Mesopotamia, but not Egypt.

 

The Fertile Crescent is a crescent-shaped geographic region of the Middle East, containing the moist and fertile land of Western Asia and the Nile Valley and Nile Delta of northeast Africa. It comprises what is known today as Iraq, Palestine (Israel and the West Bank), Egypt, Syria, Jordan, Lebanon, and parts of Persia, Saudi Arabia, and Turkey; Mesopotamia proper is its eastern portion, the land between the Tigris and Euphrates rivers.

 

They, along with the Jewish kingdoms of ancient Israel and Judea, may have pioneered rational inquiry. They invented many things, but many of their accomplishments became known to us only through the archaeological work of the 19th and 20th centuries. Unlike the Greeks, they left little surviving documentation.

 

Did the Greeks know of the Sumerian, Babylonian, and Jewish developments, discoveries, and technology? Yes and no; it was complicated. The transmission was indirect, scattered, piecemeal, and affected by cultural biases. It resulted from trade (particularly trade-route activity), war, and some intellectual exchange. But over time the Greeks absorbed enough significant information from the past, along with their own revelations, that it coalesced into the Greek academies of philosophy, science, astrology, divination, and mathematics. They identified schools of thought and recorded them in documents, along with the accounts of reliable witnesses, that survived the punishments and wear of time and fate.


Greece was first, followed by Rome. In Greece, Plato, Aristotle, Socrates, Epicurus, Thales, and Pythagoras stand out. Fortunately, much of their work remains. The Greek philosophic and scientific accomplishments were built on many earlier inventions and findings, but were profound in their improvements and extensiveness. That set the format for three schools of Western philosophy that continue and are alive and well today:

 

Stoicism emphasizes fate, virtue, reason, and personal restraint, living in harmony with nature.

 

Epicureanism is the pursuit of pleasure and the avoidance of pain as the highest good, emphasizing a simple and tranquil life.

 

Skepticism stresses doubt and questions the certainty of knowledge.

 

The Greek experience was a recognition that humans could understand the nature of themselves and the world they lived in. By learning, they created the elements of a very sophisticated civilization. But as they pursued these questions, there were many mysteries that bemused and confused them because they defied explanation. Their only rationalization was to turn to the supernatural domain. That left them to conclude it was the work of gods or, if negative, of some malevolent entity. The work of the Greek philosophers formed the foundation for the Romans who followed suit: Marcus Aurelius, Seneca, Cicero, Epictetus, and Lucretius. These individuals made significant contributions to the pursuit of truth. They and the Greeks continue to be relevant to the present, but there were others who followed a different path in countries such as China, India, Japan, Persia, and others in the eastern region.

 

With some caution, one can state that the essential difference between Western and Eastern analytical thought is that the Western system is pragmatic, while the Eastern is more spiritual. Unlike the East, which relies more on individual interpretation, the West uses logic and reasoning. The West recognizes a reality and then pursues an academic path to its truth by confirming that reality with data (facts) from verifiable sources, along with the ability to repeat outcomes. The Eastern approach accepts presented data but is less concerned with verification. The East emphasizes maintaining equilibrium with its sources, while the West prioritizes the individual. The East emphasizes society and wisdom and does not embrace the required academic practices of the West. The East also mixes philosophy with religion and ethics. Eastern philosophy encompasses a diverse range of philosophical traditions.

 

One common portrait of the difference between the Eastern and Western traditions posits a radical incommensurability in the very nature of philosophical inquiry. Chinese philosophy is “wisdom” literature, composed primarily of stories and sayings designed to move the audience to adopt a way of life or to confirm its adoption of that way of life. Western philosophy is systematic argumentation and theory.

 

“Eastern philosophy encompasses metaphysics, epistemology (study of nature, origins, and limits of human knowledge), and religion, despite the lack of a clear demarcation between the multiple categories.”[1]

 

In the late nineteenth century, the East began to move toward Western philosophical methods, particularly in Japan. This also resulted in Western adoption of some Eastern philosophical principles, especially in the twentieth century, creating hybrid forms of Western philosophy in response to modernity. They have affected such fields as psychology, ethics, and social justice.

 

Why was there no meaningful dialogue between the two academies over the many centuries of their development? Apparently, for the most part it was physical – distance, mountains, deserts, seas – and the East's forced isolation as a solution for protection: walls, fortresses, and cultural restraints. There were some exceptions. Persia, in particular, had advanced scientific work translated into Latin, influencing many in the West, including Copernicus. The conflict between Christianity and the Muslim world was a constant barrier. It is surprising that a country like China, with its ancient, advanced technology, did not develop a logic-and-reasoning system similar to the West's. China had invented paper, gunpowder, the magnetic compass, porcelain, silk, pasta, and celestial navigation instruments, among many other contributions. There must have been much more, but it was lost to the erosion of time and/or the barriers of cultural prejudice.

 

For both the East and the West, there were so many questions that even a supernatural deity could not answer. Why day and night? Why are there seasons? What causes the oceans to rise and fall twice each day? There was no way to explain them. These were mysteries that could not satisfy the logical mind. Reason required a means for understanding the phenomena. The reality behind these questions was not the problem; the problem was framing the right question to which the elements of reason could be applied.

 

Therefore, how could reason be applied to solve the riddles? Fantasy and superstition prevailed but did not mitigate the academic frustration. What method could reveal the truth? The Roman Empire's decline into insignificance culminated in its demise during the fifth century. The great monotheistic religions grew in influence and became the arbiters of what was to be accepted as reality. Explanations for these questions became integrated into religious dogma as naturally occurring events of divine origin. To oppose them was sinful, the work of the devil or some other equivalent. To resist or counter was punishable by violence or, more influential still, by the refusal of admission upon death to a promised reward such as heaven. The religious institutions dominated almost all academic pursuits for centuries in Europe and the Middle East. To further compound the problem in Europe, the language of the church was Latin. The folly of this was that almost nobody attending understood Latin, including many of the priests. In the Far East, the distance and unfriendly terrain prevented contact with Europe. Without contact, they could not take part in a shared dialogue. The Dark Ages had arrived. Academia had to wait until the fourteenth and fifteenth centuries before the darkness began to lift. But there were some exceptions, in particular from the Muslim world.

 

The first university in the world was the University of al-Qarawiyyin in Fez, Morocco. It was founded by Fatima al-Fihri, a Muslim woman from Tunisia. The institution first emerged as a mosque in 859, but later grew into the al-Qarawiyyin Mosque and University. It is still operating in the 21st century.[2]

 

The next oldest still-operating university is in Bologna, Italy. The University of Bologna received a charter from the Holy Roman Emperor Frederick I Barbarossa in 1158, but evidence exists that it actually started in 1088.[3]

 

Unfortunately, the Ottoman Empire conquered Constantinople in 1453, and the suppression continued. The Ottomans pronounced reality to be defined by an amalgamated doctrine uniting faith, reason, and religion. This not only defined reason but controlled the politics of nations.

 

At the end of the 13th century, a transformation started in Florence, Italy, then expanded to all of Italy and eventually involved most of Europe. This was a move away from the dark clouds of ignorance and repressive control of the Middle Ages. It began with a trickle that blossomed into the Renaissance – a rebirth of independent human expression. It involved art, literature, science, architecture, philosophy, and the rediscovery of Roman, Greek, and Muslim intellectual accomplishments. Dante and Giotto were early figures of the movement. It began at the end of the 13th century, spanned the 14th, 15th, and 16th centuries, and ended at the beginning of the 17th century. This brought forth the Renaissance men – Leonardo da Vinci, Lorenzo de' Medici, Nicolaus Copernicus, Petrarch, Raphael, Galileo Galilei, Michel de Montaigne, Niccolo Machiavelli, Shakespeare, Chaucer, Boccaccio, and so many more.


The Renaissance was very localized compared to movements such as the Reformation or the Industrial Revolution because it developed in the residue of the Middle Ages, where the many tools of repression still dominated so much of life. From the autocracies to the theocracies, the ideas of change and rebirth meant breaking through the walls of a thousand years of repression and diluting the powers of control. The era produced an amazing amount of change. With this change came the rediscovery of Humanism.


Humanism is a belief that human thought and experience are the prime realities of life, as opposed to divine or supernatural interpretations. It saw the goodness in humans and their ability to use rational means to solve a person's or society's problems.

One of the key drivers of the Renaissance was art. Why art? Art is one's opinion as expressed through a painting, a sculpture, or an equivalent medium. It is not a written statement or the result of dialogue. It is subject to what the viewer sees and to the viewer's interpretation. The artist's opinion is the result of lines, colors, subjects, and dimensions that can be subtle, distorted, and/or hidden, creating levels of safety from the dangerous controls of the Middle Ages. Still, many of the works remained religious; perhaps that choice was for protection.

 

Looking at some of the early artists reveals modest changes; as the Renaissance matured, the work became more vivid, harsher, and less hidden. The idea of presenting the truth about a reality drew on the power of vision – a painting or a sculpture is uniquely capable of generating a human response. The artist had no empirical tool to prove the point, but that did not weaken the truth. Artists of the Renaissance put human feeling into their work: joy, sorrow, passion, pain, boredom, remorse, agony, puzzlement – the full spectrum of human emotion. In the sculpture of Donatello and the paintings of Masaccio in the fifteenth century, we are presented with subdued color, dissonance, unhappiness, and devotion. Look at Masaccio's many group paintings: there is never contentment, or even a smile, never! In Donatello's sculpture, we see the pain of reality as opposed to the traditional human indifference in the expressions of the preceding Gothic and Roman styles of antiquity, which would no longer dominate. The artists were telling us how life really was. Jesus was shown in the pain of his existence, in contrast to the idealized, flat fantasy of the Middle Ages. Expressing reality would become the driver of Renaissance expression, and the styles were highly individualistic. The idea of three-dimensionality, as opposed to the traditional two-dimensional art, became a prime driver. The artists often were educated or had studied philosophy, mathematics, architecture, and the sciences. Traditional art was mainly about religion. The Renaissance brought in events, people of all sorts, scenes of work – the stuff of life and death. This allowed a style that could use dimensional disproportion, exaggeration, and/or a manipulated environment to speak a truth without a supporting institution behind it. The viewer, only the viewer, was the subject of influence. The churches and monarchies had been the main patrons of artists, decorating the walls and halls of worship. The Renaissance brought wealthy families into the patronage domain, decorating their homes and castles with their portraits, their accomplishments, their history, and religious themes. From this patronage base, new architectural themes emerged and were accepted, changing how buildings were designed and constructed.


As the centuries ticked on, the period of the 15th and early 16th century became known as the High Renaissance. During this period artists developed incredible technical skills with color, light, geometry, texture, materials, perspective, and architecture that exceeded anything from the past. They mastered the natural appearance of the world and of the people who were their subjects. They studied and were influenced by their predecessors in Rome and Greece, who for the most part had been ignored for a thousand years. It even led to improvements in education, opening the classroom to debate, unheard of in the Middle Ages. Here we see reality being defined by the academic methodology of empirical logic – facts, provable facts, verifiable facts. Literature began to progress. Traditional Latin and Greek were no longer dominant. New paths opened, providing epics, romances, and tales of the devil in new and easier structures. The language of the "common people" – the vernacular – became the dominant language of literature. The period gave us the works of Shakespeare, Marlowe, Chaucer, Boccaccio, and others that remain read and staged in the present. There were other improvements in society, such as banking and accounting. The power of the works of this period was so profound that they continue to affect the 21st century. They do not fade. This is also when the great polymaths lived – Leonardo da Vinci, Galileo Galilei, Copernicus, and Michelangelo, among others – breaking the barriers of the past.

 

As the Renaissance moved on, technology progressed. In the 15th century something big occurred: around 1440, Johannes Gutenberg invented the movable-type printing press, which opened a door to profound change – unrestricted logic, the free use of reason, and a safer path to truth. Suddenly, it became possible to deploy information inexpensively to the masses in languages they could understand. In an amazingly short period of time, books of every sort became available for individuals to consider ideas of every kind.

 

This was a counter to religious and other institutions holding sway over reason. Soon came the idea that the human intellect could be responsible for defining reality, along with new and independent ideas about the limits of the divine. With the Reformation, these ideas were reinforced. Religion as the arbiter of everything began to fade.

 

As the breakthrough technical transition of printing occurred, there were centers of intellectual activity that could no longer be suppressed by the religious and authoritarian establishments. The suppression had been both local and nationwide, reaching from the highest centers of power, such as the Catholic Church and the Habsburgs, all the way down to local governments at the lowest level. To distribute new, free, or restricted ideas and thoughts, verbally or otherwise, was dangerous. Punishments included fines, loss of property, banishment, excommunication, confinement, prison, torture, and even death. London, Paris, Amsterdam, and even Rome became centers of publication for all kinds of ideas, themes, and news that had been suppressed for ages. London and Amsterdam emerged as hubs for free-thinking publications because they did not restrict what could be published. They were both centers of economic prosperity and political activity. Both had a more Protestant history that was less restrictive and therefore were more open to innovation. Further, publication became a powerful tool for the Reformation. There was a convergence of economic, political, and cultural factors that made these cities unusually receptive to intellectual innovation. Both were major commercial centers, attracting merchants, scholars, and artisans. This more sophisticated population provided a demand for printed materials that could inform and entertain. The printing press wasn't just a tool; it became a big industry employing many people. Print shops became centers of intellectual dialogue, as important to public opinion as universities and the legal domain of the courts. Illiteracy began to reverse after centuries of ignorance because there was a market for news, information, and entertainment that the lower-income classes could afford for the first time. Controversy was permitted, within limitations, for the first time ever. One of the first major best sellers was Don Quixote by Cervantes. It is still in print, having exceeded five hundred million copies. Amsterdam was the most tolerant; it attracted angry dissenters, reformers, and freethinkers who were not welcomed elsewhere.

 

The age of exploration revealed a plethora of societies with many definitions of reality and their practices of truth. This caused the religious Western concept of reality to crash into the Western intellect, separating it into bits and pieces of intellectual diversity, which to this day remains a burden and an impediment to a unified world humanism. But these revelations expanded the Western intellect in all its iterations. 

 

Then the Age of Enlightenment arrived with its intellectual giants: Descartes, Newton, Leibniz, van Leeuwenhoek, Montesquieu, Spinoza, Kant, Huygens, Hegel, Kierkegaard, and others who changed the styles and judgements of reason and opened fresh paths to truth.

 

The Scientific Age, the offspring of the Enlightenment, matured in the early 1700s. Its accomplishments were many, including mathematics, astronomy, and the natural sciences such as physics, chemistry, and biology. This became the foundation for the second phase of the Scientific Age, the so-called Modern Science Age. With it came Kant's warning that man should recognize the limitations of his mind. Scientists did not disregard this as experimental evidence began to uncover surprising aspects of reality, such as space-time relationships and quantum physics, revealing that intuitive reasoning was the enemy of this new reality. This was and remains difficult for the human mindset.

 

The scientific revolution began to replace the classic Greek perception of the natural world. Science emerged as a separate discipline, no longer a part of philosophy, along with its companion, technology. Science sought 'how' as opposed to the Greek 'why.' The influx of so much new information stretched the capacity of the existing institutions. Deploying and verifying this information required new and efficient techniques. Publishing a book for every new idea became overwhelming and too expensive. Scientific papers, experimental results, and demonstrations became the practical solution. This required critical review as well as independent means of verification. Scientists formed organizations dedicated to specific disciplines. In Europe, national scientific societies were created and backed by their governments. These became centers for discussion and debate of old and new ideas and theories. By the end of the nineteenth century, thanks to the principle of the conservation of energy and the second law of thermodynamics (heat always flows spontaneously from hotter to colder regions of matter), the physical world appeared to be completely comprehensible in terms of complex but precise mathematical forms describing various mechanical transformations. The path to truth was about to become far more complex as the twentieth century arrived with its uncomfortable proclivities.

 

Beginning in the early seventeen hundreds, new figures started work that became pivotal for the next two centuries: Euler, Diderot, Laplace, Lavoisier, Joule, de Coulomb, von Helmholtz, Fresnel, Kelvin, Maxwell, de Lamarck, and Darwin.

 

As all these academic contributions were occurring, the Industrial Revolution was also under way (1750-1900). It was a fundamental process of change away from the farm, with manual labor being replaced by machines and their speed, efficiency, and expanded productivity. In the commercial goods market, traditional manufacturing as a process of handicraft labor was replaced by machines, eliminating the traditional work of human hands. The increased variety and lower costs to the buying public were irresistible. The marketplace was redefined and swelled. It is generally agreed that the Industrial Revolution started in England and spread throughout western Europe and into parts of the British Commonwealth and the United States. Eastern Europe's transition was slow and spotty because of political problems, but eventually it transitioned. The communist system had, and has, particular problems with the agrarian sector confronting the individual response to traditional farming culture. Marx and Engels were wrong about farmers. The communist countries had to adopt hybrid forms of capitalism to solve their farming troubles. It took until the late nineteenth century for the revolution to reach China, Japan, and India. These delays came down to the difficulty of economic transformation – in other words, the effects of a country's culture on extreme change and its residue, politics.

 

The Industrial Revolution went hand in hand with all previous academic accomplishments. Its most significant change was the alteration of energy distribution. Humanity has used machines since the earliest recorded history. Man has employed such devices as water wheels, windmills, crowbars, and block-and-tackle pulleys throughout the world. The key tipping point was, and remains, energy distribution. The never-ending theme of energy distribution has always been centralization moving on to decentralization.

 

Examine what occurred to a common machine, the water wheel. The use of water-driven wheels may date back to Sumerian times in Mesopotamia – 4,000 to 1,000 BC. In the 18th century, water wheels were commonly used to mill flour and a host of similar foodstuffs, a process referred to as grist milling. The main shaft of the water wheel drove, either directly or through a geared assembly, the grindstone face, which was mounted extremely close to the face of a stationary stone surface. The grains were fed into the tiny space between the two stones, crushing – grinding – them into the desired powder form we refer to as flour. Somewhere in time, it occurred to a gristmill operator that if the main shaft of the water wheel could be extended so that it could drive more than one grindstone against another stationary stone surface, production could be increased from the same water wheel. This could double, triple, or further multiply production for the operator using one water wheel. Two fundamental things occurred that had never occurred before. One, energy was being distributed from one common source to more than one machine at the same time; and two, decentralization – although primitive – was occurring, because the operator had found a way to distribute energy from the one water wheel to a second gristmill or more. The benefit to the operator was economic. As the revolution advanced, reciprocating machines were developed. The first were steam-driven devices that burned coal or wood to heat water into steam. The steam drove pistons that drove a shaft that could turn a grinding wheel or serve a multitude of applications, such as a wood sawmill blade. With the advent of steam-driven machines, gristmills and other kinds of machines no longer had to be near running water or some other natural energy source. The machines could be located at more advantageous locations, providing better economics for both the user and the operator. This significantly improved productivity and is a classic example of true decentralization. As the design and understanding of reciprocating steam engines improved, they became more powerful and lighter and therefore could be transported to locations where they were needed for only short periods of time to perform a given task, and when the task was finished, moved to the next location requiring the same kind of task – even greater decentralization. Eventually, liquid-fuel-driven reciprocating engines were invented and developed. These engines weighed significantly less and were more efficient, providing greater decentralization, mobility, and power. Their energy source – liquid hydrocarbons – provided dramatically more energy per pound and per unit of volume. Therefore, very modest storage volumes (fuel tanks) were required compared to the storage needed for the same energy in solid fuels such as wood or coal. Further, and most significant, liquid fuels were more easily transported from their sources to the locale of use, which also decreased costs. Greater decentralization was becoming the cornerstone of great economies. Improvements and applications abounded, and with the development of the electric motor came greater efficiency and a further reduction in size. Electric motors were first driven by batteries and/or nearby generators – centralization.
At the turn of the twentieth century, electricity was starting to become available over long-distance power lines, making it possible to locate generators hundreds of miles from their end users – energy decentralization – i.e., hydroelectric plants conveniently located at a water source. Perhaps the greatest example of decentralization is the internet! The internet represents a level of decentralization that was unimaginable to science just decades ago. It distributes intellectual energy through extremely efficient electronic mechanisms to anybody who wishes to participate.

 

Another significant aspect of the revolution was the development of the factory system. One facility now housed the workers, the machinery, and the entire manufacturing procedure. This improved process management and quality control. The system also created the division of labor, a method of worker specialization, thus improving the efficiency and reliability of a given task. This eventually led to the assembly-line concept: a method of manufacturing in which a product moves from place to place, adding parts sequentially, providing manufacturing speed, convenience, efficiency, and quality assurance. Ultimately, this made mass production viable and dominant. It brought down the cost of goods significantly, resulting in a larger market.


The changes in social behavior were, and continue to be, substantial. Growth in the distribution of wealth, along with labor organizations (unions), further expanded markets, including the growth of international markets. Significant agrarian populations moved to cities, redefining how countries were run and the types of local government required. This generated new types of authority that required defining standards that could be applied universally. This brought users' expectations to a new level. What the user paid for was what the user expected. The product's appearance, fit, and function had become predictable and dependable.

 

Who owned what changed as well. The concept of a single owner in charge of, and benefiting from, everything that was successful gave way to shared ownership in the form of stocks or their equivalent. Stocks became owned by many individuals and institutions. This came with great social change as the middle class expanded, providing access to more costly advocacy activity. Institutions benefited from partial ownership as a risk-reduction strategy. Worker skills that had traditionally centered on craftwork moved toward intellectual capability requiring expanded education. Again, society was redefined: a social re-layering driven by increased income levels. As this was happening, the idea of better safety emerged as a part of everything. Cars and climate, banking and stock markets, food and restaurants, medicine and patients, and on and on became additional responsibilities of government. All of this has been happening at an ever-increasing pace, resulting in a significant negative psychological effect on society. Paralleling this has been the military-industrial complex, with its ever-increasing capabilities and economic overreach weakening social and economic stability, echoing President Eisenhower's warning in his 1961 farewell address.

 

Historians later identified a second phase of industrialization, often referred to as the Second Industrial Revolution. Built on all that had previously occurred, a new door opened. This involved technical, social, safety, and financial endeavors. As an example, knowledge of the atomic structure of materials made it possible to create synthetic materials. Plastics, a ubiquitous term for all kinds of synthetic materials made from organic substances such as oil, became the father of all kinds of new products. Most are polymers that are shapable under moderate heat. NASA became an instigator of stronger, lighter metals, rare-earth applications, new alloys, and a whole new family of substances that could withstand extreme heat and pressure, such as organosilicon materials. This was driven by the challenges of space. NASA's progress was, as designed by the government, available to the public. The pharmaceutical industry joined the synthetic revolution in pursuit of expanded human and animal treatment possibilities, with growing success. As this was occurring, automation grew, with some industries becoming almost completely dominated by mechanized technology. Driving this was the invention of the semiconductor, which made computers incredibly available. There is almost nothing in any industry that does not use computerization.

 

The following names are among the most important contributors to the twentieth- and twenty-first-century academic and economic academies: Einstein, Hubble, Edison, Oppenheimer, Tesla, the Wright brothers, Watson, Freud, Curie, Perelman, Pavlov, Bohr, Goodall, Hawking, Walton, Cockcroft, Chadwick, Marsh, Turing, Wilson, Gates, and Jobs. They have each taken part in building the foundation on which humans are now creating machines that exceed the limitations of the human brain. They are Newtonian revolutionaries. As human evolution proceeds, Newtonian evolution is replacing the Darwinian. Darwinian evolution is too slow; Newtonian evolution is not.

 

When examining the ever-ongoing developments in science, it is important to consider the impact of Planck and Einstein. Planck changed classical physics in one unplanned stroke with his description of energy as being emitted in small discrete packets, which became known as quanta, upsetting the idea of energy as a continuous quantity. Einstein further upset classical physics with his theories of relativity. The exactness of the classical age was no longer exact; physics became, and now remains, a probabilistic genre. As the twentieth century progressed, our understanding of the contents, behavior, and architecture of the atom unfolded. From the first ideas of the electron, neutron, and proton, there grew a family of infinitesimally small particles, each with a function. Another unproven theory suggests that one-dimensional strings, rather than particles, vibrate at different frequencies to produce their distinct functions, similar to how different notes are generated by a guitar. Scientists gave these particles strange names like quarks, neutrinos, and Higgs bosons in what became known as the Standard Model. The Standard Model recognizes three of the four known forces in the universe – the electromagnetic, weak, and strong forces – but does not include gravity. Even our understanding of the atom is a matter of probabilities. While this frustrates the intuitive nature of human beings, it does not eliminate the impulse to understand, develop, improve, and invent. The goal of mathematically integrating quantum physics and relativity remains elusive. This is one of the greatest frustrations in modern-day physics. Relativity is a continuous theory of space and time, while quantum physics is about uncertainty, probability, and different quantum states – the measurement of which is a prediction, not an absolute.

We began as toolmakers but went on to create things that adapt, defend, communicate, improve, and/or change our environment – technology. The success of each of these depended, and still depends, on our understanding of the truth of each reality. We are driven first by the requirement of survival, and then by the elements of our social and cultural needs. To succeed, new ideas need social and cultural acceptance. History is full of good ideas that failed because of bad timing. Perhaps it is just a matter of fate. Leonardo conceived so many inventions that he could only put them on paper; the technology required was not available. He was too early.

 

But when things are ready, advancements happen. GPS replaced centuries of dead reckoning and celestial navigation in months. Everybody with a smartphone has access to high-resolution navigation. We often fail to understand a technology's underlying reality, but proceed because of the outcome. Winemaking involves chemical changes that enhance the product, yet the exact physics and chemistry behind it are not fully understood. The excellence of Stradivarius violins remains a scientific mystery that cannot be replicated today. One cannot interpret the history of technology through the lens of modern progress. Consider the following:

 

“The digital world has little patience for wisdom; its values are shaped by approbation, not introspection. It inherently challenges the Enlightenment proposition that reason is the most important element of consciousness.”

 

Then there is this . . . .

 

 “. . . there is more than an element of irrationality in the contemporary dilemma of a highly technological society contemplating the likelihood that it will use its sophisticated techniques in order to accomplish its own destruction.”[4]

 

Though there were many early attempts, William Burroughs and Frank Baldwin perfected practical adding machines in the late 1880s. They made the laborious work of addition and subtraction far faster and reduced the costly, error-prone manual system. By the 1920s, currency symbols, paper printing, and multiplication had been added. The machines made human arithmetic tasks much more efficient and reliable. Then the computer arrived, with the ability to be programmed for a specific function. There were no limitations on functionality, only the challenge of a workable and reliable algorithm on which to base the program. This invention could solve all kinds of problems faster than the human brain. The question began to arise: Could machines have cognitive abilities? Could they think, have opinions, be intelligent? Alan Turing, the codebreaker of World War II, came up with an answer. He bypassed four thousand years of debate over defining intelligence with a very simple test: if a computer's output was indistinguishable from an equivalent human output, that output demonstrated intelligence – machine intelligence. This test considered only performance; it ignored philosophy and science. As of 2025, we have not achieved a facsimile of machine-driven human thought, though there have recently been hints that a breakthrough might happen.

 

Time has moved on, and advancements in semiconductor technology have given us new tools. These tools allow us to design machines that do not require the same precision as classical programming, where a software program succeeds only if its algorithms are absolutely precise. The computer technology of the past sixty years has enabled us to organize and solve vast and complex problems faster and faster with each technological improvement. AI has mastered machine learning, but the inputs were always human-generated data. For instance, researchers designed early chess programs by encoding the legal moves and libraries of known winning strategies. That made for a very formidable opponent for a human chess master; the program could combine known moves with known strategies and create new winning strategies. The machine can outperform nearly any human, but it cannot decide whether it wants to play the game or know how well it played.
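
To make the flavor of such rule-based game programs concrete, here is a minimal, hypothetical sketch of the game-tree search (minimax) that classical engines of this kind rely on. The toy "game," its move generator, and its scoring rule below are invented stand-ins, not chess and not any specific historical program.

# A minimal sketch of minimax search: encoded legal moves (children) plus
# an encoded scoring rule (evaluate) are combined to look ahead and pick
# the best reachable outcome. The "game" here is a made-up toy, not chess.

def minimax(position, depth, maximizing, children, evaluate):
    moves = children(position)
    if depth == 0 or not moves:
        return evaluate(position)          # leaf: apply the encoded "strategy" score
    if maximizing:
        return max(minimax(m, depth - 1, False, children, evaluate) for m in moves)
    return min(minimax(m, depth - 1, True, children, evaluate) for m in moves)

# Hypothetical toy game: positions are integers, each move adds 1 or 2,
# and the scoring rule simply prefers even positions for the maximizer.
children = lambda p: [p + 1, p + 2] if p < 6 else []
evaluate = lambda p: 1 if p % 2 == 0 else -1

print(minimax(0, 4, True, children, evaluate))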

 

Researchers realized that an alternative approach was required, one that would allow machines to learn on their own. In short, a conceptual shift occurred: We went from attempting to encode human-distilled insights into machines to delegating some or all of the learning process itself to the machines. In the 1990s, a set of renegade researchers set aside many of the earlier era’s assumptions, shifting their focus to machine learning. While machine learning dated back to the 1950s, new advances enabled better solutions because the learning process became better understood. 

 

Machine learning has advanced to the point where the ambiguities that stymied the existing algorithms are no longer barriers. Machine learning provides a means of measuring outcomes against previous outcomes, diagnosing what made the difference, and then applying that learned information to improve future outcomes. For example, identifying a human eye as opposed to any other mammal's eye using the traditional method meant inserting a group of images and looking for a match – the traditional Greek road to reality. The machine learning technique instead involves inserting a group of human-eye images in various states, with no effort to have an exact match for the target, and having the system assemble the various characteristics of the human eye – not perfection, but approximations. From this, the system learns what characteristics must be present for something to be a human eye as opposed to any other kind of eye, and nothing else. Thus, the system has learned what unique elements make up the human eye. So much for rule-based methods; the system uses approximation instead. Furthermore, continued use makes the system 'smarter and smarter.'
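
A minimal sketch of this learn-from-examples idea, assuming purely synthetic stand-in data: each "image" is reduced to an invented three-number feature vector, and a simple logistic-regression model learns an approximate boundary between "human eye" and "other eye" examples rather than matching any template.

# Learn-by-approximation sketch on synthetic data; the feature values and
# class centers are made up for illustration only.
import numpy as np

rng = np.random.default_rng(0)
human = rng.normal(loc=[1.0, 2.0, 0.5], scale=0.3, size=(200, 3))   # "human eye" features
other = rng.normal(loc=[0.0, 1.0, 1.5], scale=0.3, size=(200, 3))   # "other eye" features
X = np.vstack([human, other])
y = np.concatenate([np.ones(200), np.zeros(200)])

# Logistic regression trained with plain gradient descent on the log-loss.
w, b = np.zeros(3), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # predicted probability of "human"
    w -= 0.5 * (X.T @ (p - y) / len(y))        # gradient step on the weights
    b -= 0.5 * np.mean(p - y)                  # gradient step on the bias

# The model has learned an approximate boundary, not an exact match.
test = np.array([[0.9, 2.1, 0.4], [0.1, 0.9, 1.6]])
print(1.0 / (1.0 + np.exp(-(test @ w + b))))   # close to 1 for human-like, 0 otherwise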

 

But the challenge of programs that program themselves remains just out of reach. To achieve such a goal requires machine cognitive functionality. Recent advancements indicate we are observing significant progress toward such functionality. Existing AI outstrips human abilities in dealing with enormous and varied volumes of data. AI can recognize patterns in the fog of data tsunamis that no human can, and at speeds that were considered impossible a short time ago.

 

The complexity and enormity of so much data needed a new technique. Perhaps a more effective approach would be to apply the architecture of the human brain, which regulates complex body functions in parallel, while the conscious human being thinks serially, one step at a time. This led to an alternative approach for the design of digital architecture – the neural network, or more correctly, the artificial neural network (ANN). Researchers designed it as a rudimentary approximation of the architecture of the human brain, which processes data through neurons joined to many other neurons by connections called synapses. Synapses have the property of variable strength for a given connection. A neuron is something like a transistor, but where a transistor has only two states – on or off – a neuron has multiple states and multiple connections, and therefore an analog capability, which leads to greater computing power. A neural network is designed to set appropriate connections between the nodes (a crude approximation of neurons) and to set the required strength of each connection with other nodes (the approximate equivalent of a synapse). The ANN uses a multilayer concept of interconnections at almost every level. Many variants were created, but a successful generalized concept evolved.
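
A minimal sketch of the node-and-weight idea, assuming arbitrary made-up layer sizes (4 inputs, 5 hidden nodes, 1 output): the weight matrices play the role of synapse strengths, and each layer of nodes sums its weighted inputs and applies a nonlinearity.

# Forward pass of a tiny artificial neural network with invented sizes.
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(4, 5)), np.zeros(5)   # input -> hidden "synapse" weights
W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)   # hidden -> output weights

def forward(x):
    hidden = np.tanh(x @ W1 + b1)                        # each hidden node: weighted sum + nonlinearity
    return 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))     # output node squashed to the range 0..1

print(forward(np.array([0.2, -0.1, 0.7, 0.0])))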

 

Some layers have pre-designed data for starting the process, e.g., pattern recognition. This is the essential idea behind deep learning and its dominant training technique, end-to-end stochastic gradient descent – a method that samples random examples at each iteration. Mathematically, it is an iterative optimization of the network's connection strengths. Some very advanced concepts are currently in development. Particularly interesting is selective attention – the "cocktail party effect," the ability to carry on a conversation successfully with another person in a crowded, noisy environment. This is the ability to detect specific data of interest in a data environment filled with similar data that is not of interest. Deep learning technology can detect those veiled connections and then use them to find the desired goals – useful truths.
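
A minimal sketch of stochastic gradient descent itself, assuming a toy one-weight model y ≈ w·x and synthetic data (the true slope of 3.0 is an invented example): at each iteration one random example is sampled and the weight is nudged against the gradient of that single example's squared error.

# Stochastic gradient descent on a one-weight toy model.
import random

data = [(x, 3.0 * x + random.gauss(0, 0.1)) for x in range(-10, 11)]  # synthetic data, true w = 3
w, lr = 0.0, 0.01

for step in range(2000):
    x, y = random.choice(data)      # sample one random example per iteration
    error = w * x - y
    w -= lr * error * x             # gradient of 0.5 * error^2 with respect to w

print(round(w, 2))                  # should settle near 3.0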

 

Deep learning is applied within three principal forms of machine learning.

1.    Supervised Learning uses data containing similar examples of the subject of interest. The system requires identification (labeling) of each example.

2.    Unsupervised Learning uses unlabeled data, usually in bulk, and detects similarities such as patterns, groups, or anomalies with no information about the desired outcome. Cluster interference and crossover are common problems.

3.    Reinforcement Learning uses outcome feedback – good, bad, or indifferent – to attain a designated goal. The learning process maximizes cumulative reward, actively building on the good feedback as it is recognized (see the sketch below).
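
A minimal reinforcement-learning sketch, assuming an invented one-dimensional toy world (states 0 through 4, with reward only at the last state); tabular Q-learning stands in here for reinforcement learning generally, not for any particular production system.

# Q-learning on a made-up corridor world: the agent learns from reward
# feedback alone that moving right reaches the goal.
import random

n_states, actions = 5, [-1, +1]            # move left or right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.2      # learning rate, discount, exploration rate

for episode in range(200):
    s = 0
    while s != n_states - 1:
        a = random.choice(actions) if random.random() < epsilon else max(actions, key=lambda a: Q[(s, a)])
        s_next = min(max(s + a, 0), n_states - 1)
        reward = 1.0 if s_next == n_states - 1 else 0.0
        # Update: nudge the estimate toward the reward plus the best future estimate.
        Q[(s, a)] += alpha * (reward + gamma * max(Q[(s_next, b)] for b in actions) - Q[(s, a)])
        s = s_next

# Typically learns to move right (+1) from every state.
print({s: max(actions, key=lambda a: Q[(s, a)]) for s in range(n_states - 1)})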

 

With continued use, the deep learning network continues to receive more data, sharpening the strengths of the inherent relationships and therefore improving the precision of the outcome. But the data volume and quality must always be held to the highest standards. Since humans are responsible for choosing and inputting the data, human bias is always lurking. Bias can be either direct or subtle. The greatest threat is the veiled subtlety of human behavior affecting the outcome invisibly.

 

AI is incapable of any philosophical consideration of what it is doing. Therefore, the perception of ethics and integrity is absent. The responsibility for the nature of the outcomes lies with the human beings using AI. That burdens them with the duty of monitoring with integrity. Those who would use AI for unscrupulous purposes expose the dark side of this technology. This holds true for all technologies; however, AI's impact on outcomes can be concealed, preventing its recognition. The many uses of AI cover an overwhelming spectrum of possibilities – health, war, diplomacy, business, stocks, sports, and on and on – because it supplements the human brain's limitations on these same issues. Another aspect to consider is meaningful relevance: How does AI decide that a result is no longer worth pursuing because the outcome is meaningless, or because the outcome is already good enough and therefore not worth continuing?

 

The long history of humans determining truth has run through perception, observation, experimentation, reasoning, and finally, verification. Machines that learn are now working together with humans to understand reality – a partnership of sorts. We are using technology to help us where we have limitations. The question painfully arises: Will machines eventually determine all reality because, in an evolutionary sense, we will have become obsolete?

 

On the application side, a challenge for AI is complex physical functions. Consider a mechanical facsimile of the human hand that currently exists, functions correctly, and is being used for a sensitive application such as surgery on a person. The various motions and sensitivities required are not only numerous but often conflicting. Suppose the mechanized hand is performing a task that needs continuous manipulation, where any interruption will promptly produce an adverse result. Unexpectedly, the system controlling the hand detects an object falling toward it. The system reacts defensively by retracting the hand from potential damage, interrupting the required continuous manipulation and causing an adverse outcome for what the hand was doing. Or consider a working prosthetic human hand, which is currently being developed and showing promise. The computing power and energy required for all of the artificial hand's functions could be too great and too costly for widespread prosthetic use.

 

All these things have become technically possible because of the development of greater-capacity and faster chip technology. Hundreds of thousands of transistors on a chip have become billions of transistors on the same-size chip, and computing power and data capacities have grown accordingly. This progression rests on metal-oxide-semiconductor (MOS) technology, invented around 1960 and relentlessly miniaturized in the decades since; a single transistor can now be only a few nanometers across. Conventional transistors are either on or off – a one or a zero. MOS transistors exhibit a phenomenon known as the body effect, which can alter the switching levels. Under certain circumstances, analog properties can be added, making the MOS transistor even more powerful.

 

As all this has been happening, the quantum computer has begun to appear. Quantum computers could change everything by orders of magnitude, but they are currently in the experimental phase. Quantum refers to a counterintuitive state of physics based on probability. The quantum computer changes the scale of computing to levels well beyond the fastest supercomputer in existence. A conventional computer's basic unit of information is the bit – a one or a zero. For example, four bits (2^4) can be in one of 16 different combinations of ones and zeros, representing one of 16 different numbers. In a quantum device, the basic unit of information is the qubit. A four-qubit register represents the same sixteen numbers, but represents them all at the same time. Adding additional qubits in a chain, linked through a process called entanglement, increases the capability exponentially. The power of this phenomenon is astonishing. Consider a chain of 20 qubits (2^20), representing 1,048,576 values. Add one additional qubit (2^21) and the number doubles to 2,097,152. All of those values are available simultaneously. Changing the state of one of the qubits will change the state of the others simultaneously in an ordered and foreseeable way, increasing the speed and capacity (density) exponentially. Raising the number of qubits to several hundred generates a number of states exceeding the number of atoms in the known universe. What that really means is a conundrum for mathematicians, physicists, and the entire scientific and philosophical community. Generating reliable qubits is a major problem, and isolating them in a physical entity is very difficult. Without effective isolation, noise arises that interferes with reliable outputs. Eventually, most scientists believe, it will be accomplished.
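
A minimal sketch of the exponential scaling described above, assuming an idealized, noise-free classical simulation of the bookkeeping (not an actual quantum computer): an n-qubit register is described by 2^n amplitudes, so each added qubit doubles what the register spans.

# State-vector size doubles with every added qubit.
import numpy as np

def equal_superposition(n_qubits):
    dim = 2 ** n_qubits                       # number of simultaneous basis states
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

for n in (4, 20, 21):
    state = equal_superposition(n)
    print(n, "qubits ->", len(state), "amplitudes")   # 16, 1048576, 2097152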

 

The question of how we as humans can determine the truth of a reality led to accepting the limitations of our brain power. What stopped us was how to process a lot of data quickly; it became apparent we were at a halt. The march forward through this intellectual conundrum raised a fundamental question: Are we at our limit, or only at a delay? It was not an insignificant problem. We know of no way to significantly increase the limited processing speed and capacity of our brains. It became apparent that the path forward led to the great tool of the human mind, the ability to invent – new technology. To use new technology as a substitute for the limitations of human perception and reason was a step into a new future based on a new reality. Philosophically, this was new ground. Therefore, we have to be willing to share the responsibility with a machine. The determination of truth now depends on an inorganic mechanism.

 

Darwinian evolution is too slow. Our chosen path is Newtonian evolution because of its efficiency and rapid progress. Will we create a new form of human being with our technology? That puts us in Genesis territory. "God created mankind in his own image, in the image of God …" That brings up the question: With all this technology, will the human species create a new man in man's image?

 

It took approximately fifty years and a lot of trial and error to develop semiconductor technology capable of beginning the path toward machine intelligence. We call it artificial intelligence, but it is actually processed data – neither artificial nor intelligent at present. The current state of AI generates useful information, and its 'intelligence' is human directed. The question will be: Will a day arrive when it is self-directing, with no need for human direction? There are hints it will. This will be the challenge of the partnership: who will be in charge? What will our world be like with a fully implemented AI – full integration into our technology and culture? Will AI be the all-inclusive judge of reality? Within this highly mechanized environment, where is philosophy? Where is the joy of dialogue? Where are we in the human comedy?

 

The earth has approximately nine million species, with somewhere over two million living in the ocean.[5]   We see ourselves as the dominant life form, perhaps for a slight moment in Earth time. But in that slight moment, we may destroy the Earth as we know it. The arbiters of truth have been in the hands of two entities until now: human reason and religious faith. AI is the new entity; where there were two, there are now three. AI, with its incredible speed, examining vast quantities of data, can ‘see’ relationships and patterns totally invisible to humans. Further, as it learns, it gains the ability to predict outcomes.

 

Will AI discover unknown realities? Does that suggest AI will be the ultimate arbiter of what is true about those newly revealed realities? Early successes in the pharmaceutical, entertainment, and astrophysics fields give every reason to believe that much will depend heavily on AI. It will come with dangers that should give all users pause. Privacy issues, liabilities, incomplete understandings, and poor definitions can cause significant havoc. We are Homo sapiens, with a primordial history of conflict that has not diminished to this day. Then again, AI could create opportunities for humanity that would improve the human future and the physical world, along with an understanding of the cosmos that exceeds our present ability to imagine. Perhaps it could bring us to the place where we would finally realize war is obsolete.

 

Humans have spent at least four thousand years trying to understand and quantify human reason. As previously mentioned, machines can perform complex tasks with no philosophical basis or conscience. This ultimately will have its effect on human identity and will probably remodel everything about us socially. We will develop, for better or worse, personal relationships with machines. That is already happening with the smartphone. People, particularly young people, use texting as a substitute for normal conversation. Almost any information about almost anything is available online. That should make all of us who use this capability better educated and, as a result, more aware of the consequences of our actions. Yet we easily respond to biases that distort our interpretation of reality. Too often, this leads to conflict and, worse, war. Can AI mitigate these tendencies with its expansive resources and analytic capabilities? Bad actors can distort reality with deliberate misinformation or meaningless information. This can reach the bench of justice or the center of our politics. To proceed fairly and equitably, society will have to learn how to regulate and limit AI responsibly at every level. The continuation of democracy will depend on it. AI will permeate everything from science to the arts to politics.

 

Governments are already trying to protect society from the dangers of AI. The EU has passed laws with this purpose in mind. Our best American academic entities are forming rules-based protection methodologies. At the edge, individual spiritual knowledge encounters traditional beliefs in an effort to comprehend reality in ways that will encourage fairness and justice. We have always had organizational templates, starting with the Greeks and probably the Sumerians, as guides for aiding in the arbitration of reality. AI has yielded no meaningful guidance so far.

 

 “The AI revolution will occur more quickly than most humans expect. Unless we develop new concepts to explain, interpret, and organize its consequent transformations, we will be unprepared to navigate it or its implications. Morally, philosophically, psychologically, practically—in every way—we find ourselves on the precipice of a new epoch. We must draw on our deepest resources - reason, faith, tradition, and technology—to adapt our relationship with reality so it remains human.”[6] 

 

What are the philosophical issues confronting the AI world? AI does not have the capability for self-reflection. Philosophers define philosophy as the study of knowledge, reality, and existence in relation to humankind. With AI, the understanding of each of these could become blurred by the enormity of the data and its unexpected revelations. Just as quantum science challenged the human inclination toward intuition, AI could be, or perhaps already is, an equal disruptor. Seeing new things that are unexpectedly real, things whose existence was unknown to our ordered, intuitive world, will be a shock. Such events could erode thousands of years of intellectual development, and such an erosion would probably lead to social division based on differing interpretations. The catch is who the arbiter is, man or machine. Remember, humans will make the choices at the beginning. Giving control to the machines would reduce the human species to servitude. In a most nefarious world, there might be no choice. Political leaders with narcissistic tendencies may see AI as an extension of their power to control. For the worst of us, it could almost offer the illusion of god-like powers. This is really nothing new; certain Roman emperors, Egyptian kings, and others placed themselves in this category.

 

There is no doubt there will be those who reject AI. The cost could be tragically high; consider eliminating AI from health care and transportation, to name a few examples. On the competitive human level, rejecting the speed with which AI learns to solve a problem would put the non-user out of the race. AI can sustain a focus beyond human abilities, while the complexity of so many possibilities is beyond human capabilities. We attempt to solve these problems within limitations we may or may not even be aware of. These complexities are the stuff that impedes positive solutions to the myriad problems facing individuals, families, groups, populations, societies, and governments. AI, with its superior focus and its ability to detect the patterns and relationships within a given problem and the resulting paths to a solution or solutions, could redefine human existence. If we accept this powerful tool without constraints, we might become a totally Newtonian society: partially machine-driven at first, and eventually all machine-driven. If that were to occur, humanity would face extinction by its own hand, rendering our existence and purpose obsolete.

 

The following brief glossary may be useful for readers who are unfamiliar with a few technical terms used in the AI journey:

 

CPU - The Central Processing Unit is an electronic circuit that performs general-purpose instructions for a computer in sequence (serially).

 

GPU – The Graphics Processing Unit is a specialized electronic circuit designed to execute many computations simultaneously (in parallel), primarily for processing and rendering images, animations, and videos.

 

GPU CORE - GPU cores are the individual processors located within a single GPU that operate in parallel, accelerating large, complex calculations. This differs from the sequential processing method used in conventional CPU chips.

 

NODE - A chip node once referred to the specifications and size of each transistor on a chip, such as gate length and pitch, but has come to describe the manufacturing process and the architecture used to create the chip. Smaller implies faster and more energy efficient. Nodes are typically identified in nanometers, a figure that stands in for the size and feature details of each transistor (1 nanometer ≈ 0.00000003937 inches).
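To make the serial-versus-parallel distinction above more concrete, here is a minimal, purely illustrative Python sketch, not a description of any particular chip. It contrasts a CPU-style loop that handles one value at a time with a vectorized NumPy call that hands the whole array to optimized routines of the kind that parallel hardware accelerates; the array size is an arbitrary assumption.

# Illustrative only: contrast element-by-element (serial) work with a
# vectorized operation that parallel hardware can accelerate.
import time
import numpy as np

data = np.random.rand(10_000_000)   # ten million values, an arbitrary example size

# Serial style: one multiply-add per loop pass, the way a single core steps
# through instructions in sequence.
start = time.perf_counter()
total_serial = 0.0
for x in data:
    total_serial += x * 2.0 + 1.0
serial_seconds = time.perf_counter() - start

# Vectorized style: the whole array is processed in one call, which libraries
# can spread across SIMD units or, with GPU-backed libraries, thousands of cores.
start = time.perf_counter()
total_vectorized = float(np.sum(data * 2.0 + 1.0))
vectorized_seconds = time.perf_counter() - start

print(f"serial loop:        {serial_seconds:.2f} s")
print(f"vectorized (NumPy): {vectorized_seconds:.2f} s")

On a typical laptop the vectorized version finishes many times faster; at tiny scale, that is the same principle GPU cores exploit.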

 

 

Many companies built the foundation for the potential of AI, and most are no longer relevant or in business. Remington Rand produced the first commercial computer in the U.S., the UNIVAC I, in 1951; the Census Bureau used it, and it famously predicted the outcome of the 1952 presidential election. Burroughs Corporation introduced the B5000 computer in 1962 and eventually became part of Unisys, which was formed in 1986, became a major player in the late 80s, and is still in business but does not lead. National Cash Register introduced computing for cash registers for the first time in 1952. Control Data Corporation was one of the first to focus on scientific computing and built the first successful supercomputer, the CDC 6600, in 1964. Honeywell brought significant computing to the military in the late 50s, managing to create computers small enough to install in aircraft and missiles; they are one of the few that remain relevant today, focusing on automation and aerospace. In the 1950s, RCA produced a large-scale vacuum-tube computer, the BIZMAC, designed for business and the military, and in the late 50s they made one of the first transistorized computers, the RCA 501. Remarkably, in retrospect, its modular design and centralized control architecture are still relevant today. In the mid-1960s, RCA introduced the Spectra series of computers. General Electric produced the GE-225 for business and scientific use in the early 60s and went on to build various computers intended for business applications. They were among the first to become involved with character recognition, batch and transaction processing, and time sharing, and they contributed to online services, early cloud-style computing, and Multics, a forerunner of Unix. GE always had a significant focus on financial information services as their strategic foundation. This all occurred by 1970, when they sold their computer division to Honeywell. None of these companies are AI leaders. AI started 75 years ago; it did not suddenly pop up recently.

 

As part of this paper, we must mention what many scientists believe are among the most important inventions of the twentieth century: the laser, the laser diode, and their relation, the LED. The significance of these three inventions is that they are a new way of making light from electrical energy. Before, light was produced by fire, either by burning something in the atmosphere or by heating something with electricity in an inert environment, such as a filament in a vacuum or an inert gas; the light bulb, for example. The laser was first anticipated by Einstein. What exactly is a laser? A laser (Light Amplification by Stimulated Emission of Radiation) is a device that forces a monochromatic beam of light into alignment, like soldiers marching in step; this is coherence. The net result is that all the energy is concentrated in a very tiny area and can therefore perform at power densities never before available. Lasers can be built around solids, liquids, or gases, and all rely on stimulated emission. Laser diodes do the same thing using semiconductor technology, also employing the stimulated-emission principle. LEDs also produce light using semiconductor technology, but through the mechanism of electroluminescence. LED light is neither monochromatic nor coherent, and LEDs can be made to produce many colors; they are very efficient, use very little power, and can be produced for pennies. Our electronic displays, from home TVs, smartphones, and computer screens to huge live billboards, are based on LEDs.

 

As the future is considered, one problem to be aware of is the amount of energy required to support data processing centers. Big AI firms require ever-increasing power to run their servers and are currently spending hundreds of billions of dollars creating, expanding, and upgrading these processing centers. An odd aspect of these centers is that they are physically small but consume extreme amounts of power for their size. As a result, they disrupt the distribution patterns of power-generating utilities and drive infrastructure costs sharply upward. Another serious problem is that they use huge amounts of water for cooling. Some AI companies are already building, or considering building, their own power-generating facilities. We are talking about data centers in the thousands. One of the dark sides of this need for so much power is the contaminating by-product, carbon dioxide. One facility being built in western Pennsylvania will produce four gigawatts of power from natural gas, a vast source of environmental contamination. Some of the more responsible companies are turning to renewable and sustainable energy sources such as solar, geothermal, wind, and ocean tides.
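For a rough sense of scale, consider a back-of-the-envelope sketch under two stated assumptions: that the four-gigawatt plant runs continuously at full output, and that an average home uses about 10,500 kilowatt-hours of electricity per year (a ballpark figure used only for illustration).

# Rough scale estimate only; assumes continuous full output and ignores
# transmission and cooling losses.
plant_power_gw = 4.0                  # nameplate output of the Pennsylvania plant
hours_per_year = 24 * 365             # 8,760 hours

energy_gwh = plant_power_gw * hours_per_year   # about 35,040 GWh per year
energy_twh = energy_gwh / 1_000                # about 35 TWh per year

avg_home_kwh_per_year = 10_500        # assumed ballpark household consumption
homes_equivalent = energy_gwh * 1_000_000 / avg_home_kwh_per_year

print(f"{energy_twh:.0f} TWh per year")
print(f"roughly {homes_equivalent:,.0f} average homes' worth of electricity")

Under these assumptions, a single such plant supplies electricity on the order of millions of households, which is why these centers disrupt regional power planning.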

 

At this time in its evolution, even the chip's instruction set architecture (ISA) is at issue. The reasons are simple: open-source flexibility, cost and energy efficiency, no licensing fees, heterogeneous computing (serial and parallel), and industry support. Fully implemented AI makes possible the control of extremely complicated machines beyond anything ever made.

 

The current top of the list is bipedal, life-sized humanoid robots performing a myriad of household and industrial tasks, from gathering the laundry, to taking care of grandma's needs, to handling ordered or random tasks in industry, and on and on. This is no longer an idea but a reality; companies around the world are beginning to produce these machines. UBTECH Robotics, a Chinese company, is aiming to produce one thousand such robots by 2026 and has demonstrated working prototypes. Taska CX, an American company, is producing a prosthetic hand with capacitive-touch fingers that lets the user interact with the touch screen on their phone. WalkON is producing prosthetic (robotic) legs that allow paraplegics to walk, using their crutches as the controller; the system learns what works best for that person and creates the best walking pattern. NanoOne produces a bio-printer that creates artificial human tissue, which can be combined with real human tissue for drug testing, disease modeling, and other biological research. Designer drugs, driverless vehicles of every sort, and virtual entertainment anywhere are all on the carte du jour. These are just a few examples of what is happening because of the AI technical evolution.

 

Parallel processing redefines the paradigm of computation. It is the foundation of the LLM (Large Language Model). Parallel processing provides the ability to deal with the ocean of data about almost everything. "Everything" is a daunting concept, but there is one requirement: it has to be digitizable. How is it possible for the human brain to think or process in such terms? It can't. But AI can search through oceans of data at blinding speed, looking for similarities, relationships, patterns, and so on, and, when it finds them, create order and report. The scale has changed.
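To illustrate, in miniature, what "looking for similarities" can mean in practice, here is a minimal hypothetical sketch. It represents items as vectors of numbers, roughly the way language models embed text, and scores every item against a query in a single parallel-friendly matrix operation. The sizes, the random data, and the cosine-similarity scoring are illustrative assumptions, not any company's actual method.

# Toy similarity search: each row of `embeddings` is one item described by numbers;
# a single matrix operation compares the query against all items at once.
import numpy as np

rng = np.random.default_rng(0)
num_items, dim = 100_000, 64                    # arbitrary illustrative sizes
embeddings = rng.normal(size=(num_items, dim))  # stand-in for embedded documents
query = rng.normal(size=dim)                    # stand-in for an embedded question

# Normalize so the dot product becomes cosine similarity, a simple likeness score.
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)
query /= np.linalg.norm(query)

scores = embeddings @ query              # 100,000 comparisons in one vectorized step
top = np.argsort(scores)[-5:][::-1]      # indices of the five most similar items

print("most similar item indices:", top)
print("their similarity scores:  ", scores[top].round(3))

The point is not the toy data but the shape of the operation: one matrix multiplication replaces a hundred thousand separate comparisons, and parallel hardware lets that scale to billions.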

 

When and if quantum computing arrives, the paradigm will change even more drastically. Quantum computers will put super-computing everywhere. Nothing will be sacred; there will be no secrets, and today's encryption will be folly. But quantum decoherence is a fundamental physics problem that is very difficult to overcome. Many of the largest technology companies report quantum devices, but every one of them suffers from decoherence and requires massive error correction, which currently makes them very expensive and impractical.

 

The common use of ChatGPT and other AI tools for so many things has already become an intellectual crutch. AI is being used to do things the user should be doing. For example, what is lost when a term paper is created by AI? A person's creativity and originality: the spark that makes us capable of a host of human accomplishments, such as books, articles, music, art, films, plays, technology, justice, solutions for survival, a new recipe, and so on.

 

Conclusion  

What, then, should our relationship with AI be? Should we embrace it, empower it, or partner with it? How can we resist its power to control and exploit large and complex data? We know well that the distribution of certain information, worthless information, and bad information can be destructive and divisive. In a world increasingly dependent on AI, it will be essential to establish a common set of rules and standards governing its development, deployment, and sharing. Failure to agree on a common form of behavior, with a means of observation and enforcement, will generate insecurity and eventually conflict. There can be no opacity. Humanity has so far failed to achieve an equivalent form of governance among the nations of the earth. Some progress has appeared regionally; NATO is an example. Most of the world has accepted common measurement and time standards, and science and business have areas of good compliance, such as the international patent and trademark treaties, although China is a member that does not comply. There are beginnings, such as the Global Alliance for Artificial Intelligence (GAFAI), the European Association for AI (EurAI), and the Partnership on AI (100+ members). The UN is a template for this but has succeeded and failed too many times. Ignoring the danger of AI's unregulated capabilities could be fatal to humanity.

 

The four or five-thousand-year-old trail of the common thread in pursuit of reality and truth has brought us to an upsetting moment of what we are and what we are not. We must understand and accept that a machine that can exceed the limitations of the human brain challenges the notion of our importance in our tiny place and microscopic moment in the cosmos. 

 

Where do we go from here? What is the future? One thing is certain: we don't know. But we do know that, whatever it is, it will run on an exponential calendar. Human beings in their conscious state think linearly and serially, while the events in our world and the universe are exponential. We think in a sequence of logical steps, progressing from one point to another. Our solutions, whether for what to eat for dinner or for a differential equation, are a series of decisions connecting one thought to the next to reach the truth about a reality. Everything else is exponential. Our technical progress is exponential. We went from horses to cars in 15 years. In 1946, a passenger plane's travel speed was under 300 mph; ten years later, jet planes were flying at almost 600 mph. The desktop computer first appeared in the late seventies, and the first smartphone appeared in 1994. Everything that was available in a desktop computer is now in the smartphone, plus capabilities we never imagined. Today it is something we keep in our pockets, and it is everywhere.

 

Progress doubles in ever shorter times. Think of what that implies using the old exponential example with pennies. You receive 1 penny on day one, 2 pennies on day two, 4 pennies on day three, 8 pennies on day four, 16 pennies on day five, and so on, doubling the number of pennies each day for a single thirty-day month. You end up with over $5.3MM in pennies received on that last day alone, plus nearly the same amount again accumulated over the previous twenty-nine days, for a total of approximately $10.7MM.
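For readers who want to verify the arithmetic, a minimal sketch that simply doubles the daily payment for thirty days reproduces the figures above.

# Doubling pennies for thirty days: day n pays 2**(n-1) pennies.
pennies_today = 1        # day one's payment
total_pennies = 0

for day in range(1, 31):
    total_pennies += pennies_today
    if day == 30:
        print(f"day 30 payment:   ${pennies_today / 100:,.2f}")   # $5,368,709.12
    pennies_today *= 2

print(f"thirty-day total: ${total_pennies / 100:,.2f}")           # $10,737,418.23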

 

Social media platforms are shaping our digital world and society. Programs such as Facebook, YouTube, TikTok, Instagram, X, Reddit, and others dispense information everywhere at their discretion. They can shape, stop, edit, or create their own content with no form of intellectual or legal liability. Video-conferencing services such as Zoom, Meet, iMessage, and FaceTime are redefining the styles and rules of dialogue, a form of virtual reality in which humans interact with other humans virtually, electronically. Humans respond to one another in a vast spectrum of ways that are filtered out by electronic sight and sound. Three-dimensional sight, sound, smell, the shared environment, and more are absent. We can legitimately conclude that we lose something important with so much missing in this monolithic digital arena. Combine this with Dark AI and the dogs of hell will be let loose. The other dark side of AI is warfare, where it enables the enhancement of weaponry and cyber capabilities, as well as tactical, defensive, and strategic applications. We must continually emphasize that the pace of learning and discovery in the future will be exponential, demanding adaptive, forward-thinking frameworks that grow as rapidly as the technologies they govern. There are many examples of companies that failed because of their reliance on linear thinking: Eastman Kodak, General Electric, RCA, BlackBerry, Motorola, Polaroid, and Sears, to mention a few.

 

Both Eastern and Western philosophical traditions continue to influence modern thought, spirituality, ethics, and practice. Eastern philosophical schools often integrate spiritual and religious elements, while Western schools are more secular. Both traditions have made profound contributions to global intellectual history, enriching our collective understanding of ourselves and the world. We can see their influence in fields such as psychology, ethics, social justice movements, and cosmology. The Socratic tradition emphasizes rationality, logic, and empirical evidence. By understanding these differences, we gain a deeper appreciation for the diversity of human thought and the various paths toward understanding existence and morality. An interesting example of mixing disciplines in education is the recent trend of some physicists joining philosophy departments rather than their home departments of physics.

 

The development of the internet gave us ubiquitous connection to almost everything and eliminated so many elements of distance. Communication with others, with the inherent limitations of an electronic wall that filters out many elements of human behavior, has become the standard, and not necessarily a better one. It has redefined the social norm with its imposed limitations. Where, then, does that leave AI, with its ubiquitous power to once again affect almost everything? AI is not only a change but a profound event in human technical evolution, equivalent to the Industrial Revolution, or perhaps greater. Mix that with super-computing, meaning quantum computing, and all of society will change into a very different future. Conjectures about what that tomorrow will be remain in the miasma of a thousand uncertainties, surrounded by good and evil fantasies.

 

Will a dependence on AI destroy the element of trust required to build and maintain a moral society? There are unwritten agreements in a country like ours that make a positive and good existence possible. For example, there are certain lines we have agreed never to cross, such as the use of certain derogatory terms; even if they are not written into law, they exist. AI has no sense of these elements of acceptable social behavior, and its results could be logically perfect but socially destructive. We live among many who embrace the convictions of an authoritarian and reject empirical facts. In doing so, they become slaves to the provider of ignorance.

 

The common thread in pursuing truth about reality is ever with us, as our burden and equally our benefit. Scientific advance free of ethics is a path to destruction. As I reflect on this common-thread journey, I wonder if we have opened Pandora's box. According to the legend, Pandora opened a box containing misery, evil, and curses for all of humanity. AI is already, or soon will be, everywhere; it is the universal tool to supplement our brain's limitations. We are probably dancing on the edge of the pit of disaster. We must join hands to keep ourselves from the fall. If we do the right thing, our future could be amazing.

 

Nous devons être d'accord. We must agree.

 

 


 

 

 


[1] Naima Ferdous, September 13, 2022.

[4] The Age of AI: And Our Human Future. Little, Brown and Company, p. 34.

[6] The Age of AI: And Our Human Future. Little, Brown and Company, Kindle Edition, p. 168.

 
