| a 1 in this column means 10,000 (or 10^4) | a 1 in this column means 1,000 (or 10^3) | a 1 in this column means 100 (or 10^2) | a 1 in this column means 10 (or 10^1) | a 1 in this column means 1 (or 10^0) |
| a 1 in this column means 16 (or 2^4) | a 1 in this column means 8 (or 2^3) | a 1 in this column means 4 (or 2^2) | a 1 in this column means 2 (or 2^1) | a 1 in this column means 1 (or 2^0) |
| Powers of Ten | a 1 in this column means 1 (or 10^0) | a 1 in this column means 0.1 (or 10^-1) | a 1 in this column means 0.01 (or 10^-2) | a 1 in this column means 0.001 (or 10^-3) | a 1 in this column means 0.0001 (or 10^-4) |
| Decimal Fractions | equivalent to 1 / 1 | equivalent to 1 / 10 | equivalent to 1 / 100 | equivalent to 1 / 1,000 | equivalent to 1 / 10,000 |
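The tables above all encode the same positional rule: a digit in column i (counting from the right, starting at zero) is worth digit times base^i, with negative i giving the fractional columns. A minimal Python sketch of that rule (the function name `place_value` is our own, not from any source above):

```python
def place_value(digits, base):
    """Interpret a list of digits (most-significant first) in the given base.

    A 1 in column i (counting from the right, starting at 0) contributes
    base**i to the total, exactly as in the place-value tables above.
    """
    total = 0
    for d in digits:
        total = total * base + d
    return total

# Decimal: the digits 1 0 0 0 0 mean 10,000 (a 1 in the 10^4 column).
print(place_value([1, 0, 0, 0, 0], 10))  # 10000

# Binary: the digits 1 0 0 0 0 mean 16 (a 1 in the 2^4 column).
print(place_value([1, 0, 0, 0, 0], 2))   # 16
```

The fractional columns follow the same rule with negative exponents: `10.0 ** -2` is 0.01, the "1 / 100" column.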
| 6n - 1 | 6n + 1 |
This paper reveals electromagnetic and physical vector properties discovered through the extended pattern analysis of the 5-step double-threaded helix prime number growth structure found to dominate the first 500 prime numbers. The electromagnetic sequence reversal patterns between the threads and their bonds are either end-to-end or 90-degrees. The bijective transforms and curl vectors guide us to a potential electromagnetic model of prime number growth behavior. This model is submitted as an initial method and model to place into electromagnetic and harmonic computational modeling tools by professionals in scientific computing fields.

It is okay if that quote is not wholly clear. To summarize it simply: the prime number double helix is intimately connected with electricity.
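Whatever one makes of the electromagnetic language, the arithmetic skeleton of the two "threads" is elementary number theory: every prime greater than 3 has the form 6n - 1 or 6n + 1, since the other residues modulo 6 are divisible by 2 or 3. A short Python check over the first 500 primes (our own illustrative sketch, not code from the paper):

```python
def first_primes(count):
    """Return the first `count` primes by trial division (fine for small counts)."""
    primes = []
    candidate = 2
    while len(primes) < count:
        if all(candidate % p for p in primes):
            primes.append(candidate)
        candidate += 1
    return primes

# Sort the first 500 primes into the two 6n +/- 1 "threads".
thread_minus = []  # primes of the form 6n - 1 (i.e. 5 mod 6)
thread_plus = []   # primes of the form 6n + 1 (i.e. 1 mod 6)
for p in first_primes(500):
    if p in (2, 3):
        continue  # 2 and 3 are the only primes that lie off the two threads
    (thread_minus if p % 6 == 5 else thread_plus).append(p)

print(len(thread_minus) + len(thread_plus))  # 498: every other prime fits a thread
```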
| a 1 in this column means 81 (or 3^4) | a 1 in this column means 27 (or 3^3) | a 1 in this column means 9 (or 3^2) | a 1 in this column means 3 (or 3^1) | a 1 in this column means 1 (or 3^0) |
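The base-3 place values in the table above can be exercised with a short conversion routine (a sketch of our own, using ordinary unbalanced ternary with digits 0, 1, 2):

```python
def to_ternary(n):
    """Standard (unbalanced) base-3 digits, most-significant first.

    A 1 in column i means 3**i, matching the place-value table above.
    """
    if n == 0:
        return [0]
    digits = []
    while n:
        digits.append(n % 3)
        n //= 3
    return digits[::-1]

# 81 + 27 + 9 + 3 + 1 = 121, so 121 is written 11111 in base 3.
print(to_ternary(121))  # [1, 1, 1, 1, 1]
```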
The development of balanced ternary machines has all but faded into a small footnote in the annals of computer history. And whilst research into memory cells able to efficiently represent three distinct states has been relatively minimal, there have been some efforts in this area.

The particularly important part of that quotation is the phrase "memory cells able to efficiently represent three distinct states". The Flux Thruster is essentially a design for a logical element that does exactly this! Remember, it is bifilar. One conductive pathway is making a superconducting current that is rotating clockwise, and the other conductive pathway is making a superconducting current that is rotating counter-clockwise. This means that The Flux Thruster can have three unique states: one winding can have a current moving CW while the other is moving CCW, the situation can be inverted, or both of them can be off.
Notably, researchers in Japan in the late 1990s described the possibility of using a Josephson Junction to implement ternary logic. This could be achieved with circulating superconducting currents: clockwise, counterclockwise, or off. They found that this gave the memory cells a "capability of high speed computation, low power consumption and very simple construction with less number of elements due to the ternary operation".
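Those three circulating-current states line up naturally with balanced ternary, in which each trit is -1, 0, or +1 rather than 0, 1, 2. Below is a sketch of the standard balanced-ternary conversion; the CW/OFF/CCW state labels are our illustrative mapping, not terminology from the Josephson-junction research:

```python
def to_balanced_ternary(n):
    """Balanced ternary digits (-1, 0, +1), most-significant first."""
    if n == 0:
        return [0]
    trits = []
    while n:
        r = n % 3
        if r == 2:       # a digit 2 becomes -1 with a carry into the next column
            r = -1
            n += 1
        trits.append(r)
        n //= 3
    return trits[::-1]

# Illustrative mapping of trits onto the two circulating-current directions.
STATE = {-1: "CW", 0: "OFF", +1: "CCW"}

print(to_balanced_ternary(8))                      # [1, 0, -1], i.e. 9 - 1 = 8
print([STATE[t] for t in to_balanced_ternary(8)])  # ['CCW', 'OFF', 'CW']
```

Because every trit has a negated counterpart, negation in balanced ternary is simply flipping each trit, which is one reason the representation is attractive for hardware with two symmetric current directions.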
Guosong Liu, a neuroscientist at the Picower Center for Learning and Memory at MIT, reports new information on neuron design and function in the March 7 issue of Nature Neuroscience that he says could lead to new directions in how computers are made.

The quote, "if any of the connections break, new ones automatically form to replace the old ones," is particularly interesting. With some other considerations, we can also design circuits that have this same property. The term for this is a "self-healing circuit". To quote the article "Creating Indestructible Self-Healing Circuits" by Kimm Fesenmaier:
While computers get faster all the time, they continue to lack any form of human intelligence. While a computer may beat us at balancing a checkbook or dominating a chessboard, it still cannot easily drive a car or carry on a conversation.
Computers lag in raw processing power--even the most powerful components are dwarfed by 100 billion brain cells--but their biggest deficit may be that they are designed without knowledge of how the brain itself computes.
While computers process information using a binary system of zeros and ones, the neuron, Liu discovered, communicates its electrical signals in trinary--utilizing not only zeros and ones, but also minus ones. This allows additional interactions to occur during processing. For instance, two signals can add together or cancel each other out, or different pieces of information can link up or try to override one another.
One reason the brain might need the extra complexity of another computation component is that it has the ability to ignore information when necessary; for instance, if you are concentrating on something, you can ignore your surroundings. "Computers don't ignore information," Liu said. "This is an evolutionary advantage that's unique to the brain."
Liu, associate professor of brain and cognitive sciences, said an important element of how brain circuits work involves wiring the correct positive, or "excitatory" wires, with the correct negative, or "inhibitory" wires. His work demonstrates that brain cells contain many individual processing modules that each collects a set number of excitatory and inhibitory inputs. When the two types of inputs are correctly connected together, powerful processing can occur at each module.
This work provides the first experimental evidence supporting a theory proposed more than 20 years ago by MIT neuroscientist Tomaso Poggio, the Eugene McDermott Professor in the Brain Sciences, in which he proposed that neurons use an excitatory/inhibitory form to process information.
By demonstrating the existence of tiny excitation/inhibition modules within brain cells, the work also addresses a huge question in neuroscience: What is the brain's transistor, or fundamental processing unit? For many years, neuroscientists believed that this basic unit of computing was the cell itself, which collects and processes signals from other cells. By showing that each cell is built from hundreds of tiny modules, each of which computes independently, Liu's work adds to a growing view that there might be something even smaller than the cell at the heart of computation.
Once all the modules have completed their processing, they funnel signals to the cell body, where all of the signals are integrated and passed on. "With cells composed of so many smaller computational parts, the complexity attributed to the nervous system begins to make more sense," Liu said.
Liu found that these microprocessors automatically form all along the surface of the cell as the brain develops. The modules also have their own built-in intelligence that seems to allow them to accommodate defects in the wiring or electrical storms in the circuitry: if any of the connections break, new ones automatically form to replace the old ones. If the positive, "excitatory" connections are overloading, new negative, "inhibitory" connections quickly form to balance out the signaling, immediately restoring the capacity to transmit information.
The discovery of this balancing act, which occurs repeatedly all over the cell, provides new insight into the mechanisms by which our neural circuits adapt to changing conditions.
This work is funded by the National Institutes of Health and the RIKEN-MIT Neuroscience Research Center.
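As a toy illustration of the trinary add-or-cancel behavior described in the article (not a model of real neurons), consider a "module" that sums excitatory (+1) and inhibitory (-1) inputs:

```python
def module_output(inputs):
    """Sum excitatory (+1) and inhibitory (-1) inputs; 0s are ignored signals.

    Toy illustration only: returns the sign of the net input, so two opposite
    signals cancel while like signals reinforce.
    """
    net = sum(inputs)
    return (net > 0) - (net < 0)  # sign of the net input: +1, 0, or -1

print(module_output([+1, +1, -1]))  # 1: excitation outweighs inhibition
print(module_output([+1, -1]))      # 0: the two signals cancel
```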
Imagine that the chips in your smart phone or computer could repair and defend themselves on the fly, recovering in microseconds from problems ranging from less-than-ideal battery power to total transistor failure. It might sound like the stuff of science fiction, but a team of engineers at the California Institute of Technology (Caltech), for the first time ever, has developed just such self-healing integrated chips.

The possibilities are quite exciting!