More Details

Limits of Classical Computing

The past five decades have witnessed a dramatic acceleration of computer evolution to multi-billion-transistor chips, collectively able to execute multiple exaOPS (10¹⁸ operations per second) when configured into massively parallel supercomputer architectures. Even today’s fastest supercomputers, however, remain classically constrained, energy-inefficient, and ultimately sequential at both the component and system levels.

‘Moore’s law’—the doubling of transistors on a chip roughly every 18 to 24 months—has propelled classical (non-quantum) computing evolution for the past half century. For roughly three of those decades, Dennard scaling delivered more numerous, faster, and increasingly energy-efficient transistors, and correspondingly more capable applications, with each succeeding processor generation.

The breakdown of Dennard scaling and the slowing of Moore’s law, to which the industry-wide shift to multicore processors after 2005 was partly a response, now show signs of limiting multicore designs as well. Contributing factors to the collapse of classical exponential scaling include power and parallelism constraints at small dimensions, ‘dark silicon’ (chip area that must remain powered off to stay within thermal budgets), and sequential processing bottlenecks.

Continuing exponential improvements in classical processor speed, memory and integration under Dennard scaling and ‘Moore’s law’ are not sustainable due to quantum effects. Components and circuits today are fabricated and operate at intrinsically quantum mechanical dimensions—the molecular (nano), atomic (ångström), and sub-atomic (pico) scales.

To enhance computer and network processing sufficiently to meet ever-growing worldwide computational and availability demands, the following conditions must be met:

  • Computer and network components and systems must be driven at ever-higher clock frequencies, within shrinking chip geometries and with ever-lower memory latencies—a quest that has led to the collapse of Dennard scaling and ‘Moore’s law’ within classical space-time;
  • Computer and network components and systems must be increasingly integrated due to the speed of light limitation while remaining within classical space-time symmetries;
  • Increasingly miniaturized components and systems must become continually more energy-efficient while avoiding serial-architecture (von Neumann) bottlenecks and resistance-capacitance delays, issues that classical parallel processing platforms only temporarily defer.

Extrapolating classical exponential scaling trends yields a limit of one atom per bit, and single-electron transistors, in roughly the 2020 timeframe. Before these levels are reached, it becomes necessary to harness quantum effects to keep pace with worldwide compute-intensive evolution.
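A back-of-the-envelope sketch of this extrapolation follows, assuming an illustrative 1975 starting density of about 10⁹ atoms per bit and an 18-month halving period; neither figure is from this book.

```python
# Hedged extrapolation sketch: atoms needed per bit, halved every
# ~18 months from an assumed ~10^9 atoms/bit in 1975. Both figures are
# illustrative assumptions chosen to show how a ~2020 limit arises.
import math

atoms_per_bit = 1e9       # assumed starting density, circa 1975
start_year = 1975
halving_years = 1.5       # assumed Moore's-law-like halving period

halvings = math.log2(atoms_per_bit)        # halvings to reach 1 atom/bit
year_one_atom = start_year + halvings * halving_years
print(f"~{halvings:.0f} halvings -> one atom per bit around {year_one_atom:.0f}")
```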

Emergence of Quantum Computing

Of all the candidate technologies that continue to scale well beyond the current classical era, quantum logic has one unique feature—it is not contained by classical space-time physics. ‘Moore’s Law’ is exponential, and any classical approach to sustaining it demands exponential increases in space or time. Even the Avogadro’s number (roughly 6 × 10²³) of elements in a molecular computer is quickly exhausted by the size of an exponential problem. Quantum computing and networking access Hilbert space, the one exponential resource that has remained untapped for computation.

Quantum computers, in various stages of R&D, operate according to the rules of quantum mechanics, which govern the world of the very small—the waves and particles intrinsic to the nano scale (molecular scale, 10⁻⁹ meter), ångström scale (atomic scale, 10⁻¹⁰ meter), and pico scale (10⁻¹² meter, the domain of electrons and photons). Perhaps the most striking characteristic of quantum computers is that elementary particles can persist in two or more states at once, so that n quantum bits (qubits) jointly span 2ⁿ states, making possible processing units that are exponentially more efficient than any conventional, “classical” computer could ever be.
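A minimal sketch of what that exponential state space means for classical simulation, assuming 16 bytes per complex amplitude (double precision):

```python
# Why n qubits overwhelm classical memory: a full state vector holds
# 2**n complex amplitudes. Assumes 16 bytes per amplitude (complex128).
BYTES_PER_AMPLITUDE = 16

for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n:2d} qubits -> 2^{n} = {amplitudes:,} amplitudes (~{gib:,.2f} GiB)")
```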

Quantum computers operate in truly parallel fashion, with sequential and simultaneous computing co-structured into their very nature. Quantum simultaneity means that all computational pathways are pursued at once, exponentially eclipsing the serial processing bottlenecks encountered in conventional (classical) computers. In other words, each quantum processing operation acts on all system states simultaneously: one machine cycle, one ‘tick of the quantum computer clock,’ computes not just on one machine state (as in non-quantum, serial computers) but on all possible machine states at once.
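A toy classical simulation can illustrate this ‘one tick touches every amplitude’ picture; the sketch below explicitly stores all 2ⁿ amplitudes, which is precisely the bookkeeping a real quantum computer avoids.

```python
# Toy state-vector sketch: a single matrix-vector multiply (one "tick")
# updates every one of the 2**n amplitudes together. Simulation only;
# not a claim about any particular hardware.
import numpy as np

n = 3                                          # 3 qubits -> 8 amplitudes
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

U = H                                          # build H (x) H (x) H
for _ in range(n - 1):
    U = np.kron(U, H)

state = np.zeros(2 ** n)
state[0] = 1.0                                 # start in |000>
state = U @ state                              # one step acts on all 8 states
print(np.round(state, 3))                      # uniform 1/sqrt(8) everywhere
```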

Quantum Computing Implications

Quantum computers may one day solve, in a fraction of the time, problems that conventional computers could never manage, including:

  • Quantum Search—Searching the Internet, Surface Web, Deep Web, private content, and database repositories with far greater contextual precision than is imaginable today, even using massively parallel (non-quantum) systems, via quantum search engine applications that examine and update all possible locations and contexts within several seconds to a few minutes (a toy simulation of the underlying quantum search primitive appears after this list).
  • Quantum Simulation—Simulating the intricacies of complex classical and quantum systems at scales unattainable by conventional computing technologies, leading directly to dramatic breakthroughs in cost-efficient and environment-friendly ageless materials based on optimum strength-to-weight ratios, atomic- and subatomic-scale architecture, electronic- and photonic-scale chip design (pico-electronics), one-step design-to-build quantum memory, and optimized design-to-construction of entire communities.
  • Quantum Database and Modeling—Modeling national and global economies based on continually refreshed contextual worldwide search results of billions of networked and standalone information and data repositories. Rapid and accurate weather and climate pattern forecasting.
  • Quantum Factoring—Factoring large numbers exponentially more rapidly than is currently possible with the best known non-quantum methods, enabling timely access by authorized individuals to secure private information.
  • Quantum Cryptography—Ensuring protection of sensitive data and information through quantum cryptographic methods that reliably prevent unauthorized access through classical or quantum means, regardless of the scale of breadth and depth of brute-force or contextual compromise attempts.
  • Quantum Factoring, Order-Finding, Period-Finding, and the Quantum Fourier Transform—Performing global-scale calculations and worldwide database updates with contextual precision across a wide range of systems within a matter of minutes, regardless of the scale or order of the distributed permutations involved.
  • Quantum Counting—Calculating all required solutions to any scale of presented problem within a tractable period of time, regardless of the dimensions involved.
  • Quantum Teleportation—Transferring quantum states across great distances using the non-local correlations of quantum entanglement (a classical communication channel is still required, so teleportation does not by itself transmit information faster than light).
  • Quantum Error Correction—Automatically fixing system errors while revealing nothing about ongoing quantum computations, thereby preserving the system in a state of quantum superposition—that is, a dynamic state of all computational possibilities.
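As promised under Quantum Search above, here is a hedged toy simulation of Grover’s algorithm, the standard quantum search primitive; the register size and marked index are illustrative assumptions.

```python
# Grover's search, simulated with an explicit state vector: a marked
# item among N = 2**n is found in ~(pi/4)*sqrt(N) iterations, versus
# ~N/2 classical probes. Sizes below are illustrative.
import numpy as np

n = 4                                   # 4 qubits -> N = 16 candidates
N = 2 ** n
marked = 11                             # index being "searched" for

state = np.full(N, 1 / np.sqrt(N))      # uniform superposition over N items

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # 3 iterations for N = 16
for _ in range(iterations):
    state[marked] *= -1                 # oracle: phase-flip the marked item
    state = 2 * state.mean() - state    # diffusion: inversion about the mean

print(f"P(marked) after {iterations} iterations: {state[marked] ** 2:.3f}")  # ~0.96
```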

The Cosmic Computer® and Cosmic Switchboard®

When we consider a quantum computational system of n quantum bits (qubits), we find the computational basis states of this system to be of the form |χ₁χ₂ … χₙ⟩, where each χᵢ is 0 or 1. The quantum state of the system is therefore specified by 2ⁿ probability amplitudes. For n greater than 500, this number exceeds the estimated number of atoms in the known physical universe.
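A quick worked check of that comparison, taking the commonly cited estimate of roughly 10⁸⁰ atoms in the observable universe (the estimate itself is an assumption; published figures vary):

```python
# 2**500 amplitudes versus an assumed ~10**80 atoms in the observable
# universe: the amplitude count exceeds the atom count by ~70 orders
# of magnitude.
import math

n = 500
log10_amplitudes = n * math.log10(2)    # log10 of 2**500, about 150.5
print(f"2^{n} ~ 10^{log10_amplitudes:.0f}")
print(f"excess over 10^80 atoms: ~10^{log10_amplitudes - 80:.0f}")
```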

The Cosmic Computer® and Cosmic Switchboard®, stationed at the level of the Unified Field, are found to be perpetually processing 2ⁿ amplitudes, even for systems that contain only a few hundred atoms, to say nothing of the massively parallel infinity-point calculations eternally proceeding behind the scenes to evolve and maintain all the Laws of Nature on every level of creation. We extrapolate that Nature maintains greater than 2³⁰⁰ calculations for every few hundred atoms throughout the entire universe.

The scale of Natural Law calculations of the Cosmic Computer® and Cosmic Switchboard® is estimated to extend exponentially beyond the atomic level when we shift our attention to the scales of the fundamental force particles (photons for the Electromagnetic Force; weak gauge bosons for the Weak Force; gluons for the Strong Force; gravitons for the Gravitational Force). The fundamental force computation density of the Cosmic Computer® and Cosmic Switchboard® is again extended hyper-exponentially at the superstring dimensions that pervade the Planck scales of 10⁻³³ centimeter and 10⁻⁴³ second.

This book reveals the entire structure of the Cosmic Computer® and Cosmic Switchboard® to be pure, cosmic intelligence, which Maharishi Mahesh Yogi has identified as integral to Maharishi Vedic Science in terms of the infinity-within-all-points and all-points-within-infinity cosmic computational foundation for perfection of evolution. We locate the Cosmic Computer® and Cosmic Switchboard® within the self-luminous junction point of the Hardware-Software Gap™, within human physiology, and throughout every point of manifest creation. It is here that we discover that intelligence which is at once numeric and bounded, where physical digits are connected to numeric digits, and where the physical is expressed in terms of numbers.



COSMIC COMPUTER, COSMIC SWITCHBOARD, DIGITAL UNIVERSE, RAAM GATE, VEDACOM, and VEDIC COMPUTING are registered trademarks of Thomas J. Routt. GLOBAL INTERNETWORK, HARDWARE-SOFTWARE GAP, NETWORK-ON-A-CHIP, QUANTUM GAP, QUANTUM NETWORK ARCHITECTURE, QUANTUM SEARCH ENGINE, and VEDIC GAP are trademarks of Thomas J. Routt. Other brand and/or product names may be trademarks or registered trademarks of their respective owners.