
The nature of quantum parallel processing and its implications for coding in brain neural networks: a novel computational mechanism

Published 20 May 2025 in q-bio.NC (arXiv:2505.14503v1)

Abstract: Conventionally, the nerve impulse is assumed to be an electrical process, based on the observation that electrical stimuli produce an action potential as defined by Hodgkin and Huxley (1952) (HH). Consequently, investigations into the computation of nerve impulses have almost universally been directed at electrically observed phenomena. However, such models of computation are fundamentally flawed: they assume that an undiscovered timing system exists within the nervous system. In our view, it is synchronisation of the action potential pulse (APPulse) that effects computation. The APPulse, a soliton pulse, is a novel purveyor of computation and a quantum mechanical pulse: i.e., it is a non-Turing synchronised computational event. Furthermore, APPulse computational interactions change frequencies on timescales measured in microseconds rather than milliseconds, producing effective, efficient computation. The HH action potential remains a necessary component for entropy equilibrium, providing the energy to open ion channels, but it is too slow to be functionally computational in a neural network. Here, we demonstrate that quantum non-electrical soliton pulses converging on points of computation form the main computational structure, with synaptic transmission occurring at slower millisecond speeds. Thus, the APPulse accompanying the action potential is the purveyor of computation: a novel computational mechanism that is incompatible with Turing timed computation and AI.
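For context on the millisecond timescale the abstract attributes to the HH action potential, the following is a minimal sketch of the classic Hodgkin–Huxley (1952) membrane equations using the standard squid-axon parameters and forward-Euler integration. This illustrates only the textbook HH model referenced in the abstract, not the authors' APPulse mechanism; parameter values and the constant injected current are conventional assumptions, not taken from this paper.

```python
import math

# Standard Hodgkin–Huxley (1952) squid-axon parameters (assumed textbook
# values; units: mV, ms, uA/cm^2, mS/cm^2, uF/cm^2).
C_M = 1.0
G_NA, G_K, G_L = 120.0, 36.0, 0.3
E_NA, E_K, E_L = 50.0, -77.0, -54.387

# Voltage-dependent gating rate functions (V in mV, rates in 1/ms).
def alpha_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def beta_m(v):  return 4.0 * math.exp(-(v + 65.0) / 18.0)
def alpha_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def beta_h(v):  return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
def alpha_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def beta_n(v):  return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate(i_ext=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration of the HH membrane equation.

    Returns the membrane-potential trace (mV) sampled every dt ms.
    """
    v = -65.0  # resting potential
    # Start the gates at their steady-state values for the resting potential.
    m = alpha_m(v) / (alpha_m(v) + beta_m(v))
    h = alpha_h(v) / (alpha_h(v) + beta_h(v))
    n = alpha_n(v) / (alpha_n(v) + beta_n(v))
    trace = []
    for _ in range(int(t_max / dt)):
        # Ionic currents: sodium, potassium, and leak.
        i_na = G_NA * m**3 * h * (v - E_NA)
        i_k = G_K * n**4 * (v - E_K)
        i_l = G_L * (v - E_L)
        v += dt * (i_ext - i_na - i_k - i_l) / C_M
        # First-order gating kinetics.
        m += dt * (alpha_m(v) * (1.0 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1.0 - h) - beta_h(v) * h)
        n += dt * (alpha_n(v) * (1.0 - n) - beta_n(v) * n)
        trace.append(v)
    return trace

trace = simulate()
print(f"peak membrane potential: {max(trace):.1f} mV")
```

With a sustained ~10 uA/cm^2 drive the model fires action potentials whose rise and fall span roughly a millisecond each, which is the electrical timescale the abstract contrasts with its proposed microsecond-scale soliton interactions.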
