The Multi-Biophysical Nature of Computation in Brain Neural Networks
William Winlow *, Andrew Simon Johnson
Dipartimento di Biologia, Università degli Studi di Napoli, Federico II, Via Cintia 26, 80126 Napoli, Italy
* Correspondence: William Winlow
Academic Editor: Severn B. Churn
Special Issue: Initiation and Propagation of the Neural Message: Communication and Computation by Action Potentials
Received: August 04, 2025 | Accepted: March 17, 2026 | Published: April 13, 2026
OBM Neurobiology 2026, Volume 10, Issue 2, doi:10.21926/obm.neurobiol.2602331
Recommended citation: Winlow W, Johnson AS. The Multi-Biophysical Nature of Computation in Brain Neural Networks. OBM Neurobiology 2026; 10(2): 331; doi:10.21926/obm.neurobiol.2602331.
© 2026 by the authors. This is an open access article distributed under the conditions of the Creative Commons by Attribution License, which permits unrestricted use, distribution, and reproduction in any medium or format, provided the original work is correctly cited.
Abstract
Comprehending the nature of nerve communication is fundamental to our understanding of how nervous systems function. The ionic mechanisms underlying action potentials in the squid giant axon were first described by Hodgkin and Huxley in 1952, and their findings have formed our orthodox view of the physiological action potential. However, substantial evidence has now accumulated to show that the action potential is accompanied by a synchronized, coupled soliton pressure pulse in the cell membrane, the action potential pulse (APPulse), which we have recently shown to have an essential function in computation. Computational models usually describe the action potential as a binary event, but we have shown that, for meaningful computation, it must be a quantum ternary event, known as the computational action potential, whose temporal fixed point is the threshold of the soliton rather than the plastic action potential peak used in other models. Here we argue that for computation to occur in neurons, it must do so at the locations of neuronal convergences, by frequency-modulated quantum interference. The timing of frequency changes indicates that the threshold must activate in less than 10⁻⁶ s, much faster than synapses can. APPulses in a brain neural network collide according to the latencies of the neurons and the distinct frequency patterns. Here, we review the interactions between the soliton and the ionic mechanisms known to be associated with the action potential. Elsewhere, we have demonstrated this type of frequency computation in detail for the retina and provided an extensive analysis of computation in other brain neural networks. However, while the physiological action potential is important for neural connectivity, it is irrelevant to computational processes, for which the soliton component of the APPulse always provides the timing and effectiveness.
Keywords
Nerve impulse; physiological action potential; soliton; action potential pulse; computational action potential; reverberatory circuits; brain neural networks
1. Introduction
The nature of the nerve impulse has been the subject of detailed research since the time of Sir Charles Sherrington in the 1890s [1] and Edgar Adrian [2], who were jointly awarded the Nobel Prize in Physiology or Medicine in 1932. Sherrington discovered the nature of reflexes, and Adrian provided evidence for the all-or-none law in nerves and muscles. However, it was not until 1952 that further Nobel Prize winners Hodgkin, Huxley and Katz [3] revealed the ionic basis of the action potential itself in a brilliant series of experiments (Figure 1), supplemented by the work of Rall [4] who provided the cable equation. Thus, the nerve impulse was considered a largely electrochemical phenomenon. It was assumed that the temporal and communicative aspects of nervous activity could be resolved by electrical theory. In other words, much has been done to understand the biophysical mechanisms underlying nerve action potentials, thanks to the excellent work by Hodgkin and Huxley and their predecessors.
Figure 1 The action potential monitored in a snail neuron at left, using a DC preamplifier and a glass microelectrode to measure voltage changes, and using the voltage clamp technique to measure current changes at right, as visualised by Hodgkin, Huxley, and Katz in 1952 (from Miles [5]) in a squid giant axon. In both cases, the dashed line represents the threshold at which an action potential is triggered. Note the prolonged after-hyperpolarisation in the Lymnaea neuron during the refractory period.
Over the last few years, it has become clear that action potentials and their forms of computation are much more complex than originally suggested: mechanical, thermal, and optical changes are now known to be associated with them (e.g. Drukarch and Wilhelmus 2025 [6]; see Table 1) and should be incorporated into their description, showing that action potentials have a multi-biophysical nature. Furthermore, action potentials have now been shown to have three interdependent functions: communication, modulation, and computation [7].
Table 1 Accumulating evidence for the soliton pressure pulse in neuronal membranes.

2. Communication within Nervous Systems
2.1 Neurons and Networks
Neurons are diverse and have many shapes, sizes, and functions. They may have evolved from secretory cells in the early metazoa, and we can envisage that as animal size increased, the action potential evolved to control secretions at a distance [39,40], although many local circuit neurons in both vertebrates and invertebrates are spikeless (forming no discernible spike) [8,23,41,42,43]. The discovery of the nature of the action potential was critical to the development of modern neurophysiology, but the action potential has since been modelled as a binary event in computational brain networks. Unfortunately, neurocomputation cannot work in the same way as in computers, because computers are constructed with silicon chips, which are functionally different from the soft cellular structures of the nervous system (Johnson and Winlow [9]). Furthermore, with the exception of circadian rhythms induced by the environment, there is no clock in the brain, unlike Turing machines and Turing-derived modern computers, which incorporate strict timing.
2.2 The Latency Problem and Frequency Computation
One of the conceptually important functions of the action potential is its ability to compute within the brain neural networks, each of which contains neurons of differing excitabilities and latencies, but no central timing mechanism. It is this unique ability to compute the frequency-modulated parallel inputs into meaningful logical outputs that this paper describes. We have shown that the HH action potential is inappropriate for computation [10] within nervous systems. The changes in latencies between computational points in a network where inputs are frequency modulated cannot be resolved by conventional computational modelling, and this emphasises the fundamental difference between the functioning of a physiological brain neural network and any other.
2.3 The Importance of Latency
The most fundamental element of this type of computation is the synchronisation of APPulses across a network, which must take place within the precision of calculated neural latencies, compared with the precision of the action potential; the neuron's latencies thus become part of the computation. This is also the fundamental difference between the classical action potential of HH, the soliton described by Heimburg [11], and the action potential pulse. Action potential latencies must be measured from the spike peak, and latencies vary between successive APs. Many neurons do not form a spike, and many that do produce spikes that do not resemble the classical action potential. By contrast, the leading edge of the ‘soliton’ is well defined by its pulse-like structure, theoretically precise to the molecular level of ion channel opening. However, by itself, the soliton may not have the facility for computation.
We do not dispute the HH action potential and accept that some form of physical pulse/movement is inevitable from the ion movements produced. The ‘soliton’ is also the most fitting explanation for measured entropy exchanges and for calculating both the speed and latencies of action potentials. In situ, any ‘soliton’ travelling through a membrane composed of many types of protein and lipid will inevitably lose entropy and decay without the energy provided by the HH action potential. The explanation is the combined action potential pulse, the APPulse, in which the soliton accompanies the action potential in axons [23,24,44], cardiac muscle [45], and most probably skeletal muscle.
The assumption of almost instantaneous activation of successive ion channels to produce the Hodgkin-Huxley action potential rests upon the belief that electrostatic charge can travel from one channel to the next at the speed of the action potential. However, empirical evidence from channel spacing, ionic radii, and diffusion coefficients demonstrates that this is not, by itself, the case [23,24,45]. Substantial evidence has now accrued to show that the HH action potential is accompanied by a coupled soliton pulse (Table 1), which appears to instigate channel opening, as shown in Figure 2 (from Johnson and Winlow 2018 [24]). Unfortunately, the original work of Tasaki et al. [12,13] was largely ignored until the early 21st century [11,14]. The APPulse will of course require energy to maintain its progress, and we suggest that this is generated by the ATP pumps that also generate the membrane potential and provide the energy for the HH action potential.
Figure 2 Instigation of channel opening by the APPulse. (1) Pressure from the accompanying pressure wave of the action potential disturbs the ion channel electrostatic seal. Attracted electrostatically charged ions pass through the channel, causing it to contract across the membrane. This, in turn, puts energy back into the pressure wave. (2) The ion channel becomes refractory when enough Na+ ions pass through to produce electrostatic equilibrium. Partially reconstructed from Johnson and Winlow [23] and McCusker et al. [37], used under Creative Commons BY-NC-SA 3.0.
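The channel-spacing argument above can be illustrated with a back-of-envelope calculation. The parameter values below (Na⁺ diffusion coefficient, impulse speed, channel spacing) are illustrative assumptions chosen for the sketch, not measurements from the cited studies:

```python
import math

# Illustrative parameter values (assumptions, not measured values):
D_NA = 1.33e-9           # diffusion coefficient of Na+ in water, m^2/s
V_AP = 20.0              # assumed propagation speed of the impulse, m/s
CHANNEL_SPACING = 50e-9  # assumed mean spacing between Na+ channels, m

# Time available for the signal to reach the next channel at impulse speed
t_transit = CHANNEL_SPACING / V_AP

# Root-mean-square distance a Na+ ion diffuses in that time (1-D: x = sqrt(2Dt))
x_diffusion = math.sqrt(2 * D_NA * t_transit)

print(f"transit time to next channel: {t_transit:.2e} s")
print(f"Na+ diffusion length in that time: {x_diffusion:.2e} m")
print(f"shortfall factor: {CHANNEL_SPACING / x_diffusion:.1f}x")
```

Under these assumed values, an ion diffuses only a few nanometres in the time the impulse front takes to reach the next channel, an order of magnitude short of the assumed spacing, consistent with the argument that electrodiffusion alone cannot carry the signal from channel to channel at action potential speed.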
3. Frequency Computation at Neuronal Convergences
The implications of frequency computation result from the theoretical mathematics of collision on the cell membrane at neuron convergences, where the outputs are the mean sampled frequencies of the collective inputs [8], as illustrated in Figure 3. These types of neuronal convergences are ubiquitous, sometimes taking place across the soma, and also throughout the sensory systems, where many neurons converge onto a ganglion. This model explains the actions and outcomes of collisions at these convergences and accounts for the observed results.
Figure 3 Frequency computation between neurons. (a) illustrates the timing of the immediate opening of the ion channels, the action potential/soliton (APPulse) threshold, with the nerve impulse travelling left to right. Above: the blue line represents the change in membrane potential, and the purple line represents the resting potential. Below is a model of the membrane/soliton (APPulse) linked to the potential change above; this is a cylindrical representation with the sides of the membrane in red and the soliton represented by the purple sleeve. Time t represents the precision of the threshold, corresponding to the timing of the opening of the ion channels sufficient to maintain the soliton [24,35]. Refractory time begins during this threshold. (b) A model of frequency computation using quantum timing, showing a convergence of axons to a point of computation (POC). Quantum APPulses travel to the POC from left to right; quanta are shown in blue. The frequencies are indicated by approximate distances. The refractory period is specified by r (in reality, r is many times the threshold), and differing results of the interaction are shown in (c) and (d). As each pair of APPulses arrives at the convergence, they combine or redact according to their respective phase and whether the phase threshold encounters a refractory period of the membrane. This changes the frequency of the output such that it represents the mean sampled frequency of the inputs. (e) shows another range of frequencies, based upon a different set of inputs, demonstrating that output frequencies have been changed to reflect input frequencies distinctly (f); this is the definition of computation. The implication is that in a brain neural network, this computation will diffract inputs and outputs of APPulses along distinct tracts, forming a three-dimensional pattern distinct to each set of parallel inputs. In terms of memory, each APPulse contains 1 trit of information per phase.
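The combine/redact rule described for Figure 3 can be sketched as a minimal simulation of a point of convergence. The refractory value and the simplified rule (a pulse arriving within the refractory window left by the last transmitted pulse is redacted) are our illustrative reading of the figure, not parameters from it:

```python
# Minimal sketch of collision at a point of convergence (POC): pulses from
# converging axons merge into one output train, and any pulse arriving during
# the refractory window of the previous output pulse is redacted.
REFRACTORY = 0.003  # assumed refractory period, s (3 ms)

def converge(*input_trains, refractory=REFRACTORY):
    """Merge spike-time lists from converging inputs, redacting any pulse
    that falls inside the refractory window of the last transmitted pulse."""
    merged = sorted(t for train in input_trains for t in train)
    output, last = [], None
    for t in merged:
        if last is None or t - last >= refractory:
            output.append(t)
            last = t
    return output

def train(freq_hz, duration=1.0, phase=0.0):
    """Regular spike train at freq_hz for `duration` seconds."""
    period = 1.0 / freq_hz
    return [phase + i * period for i in range(int(duration * freq_hz))]

# Two 40 Hz inputs converging: the output remains 40 Hz, not 80 Hz, because
# the second pulse of each near-coincident pair is redacted.
a = train(40)
b = train(40, phase=0.001)  # 1 ms offset, inside the refractory window
out = converge(a, b)
print(len(a), len(b), len(out))  # 40 40 40
```

With unequal input frequencies, the same rule changes the output train pulse by pulse according to which arrivals survive the refractory window, which is the mechanism the figure describes for frequency-dependent outputs.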
3.1 Chemical and Electrical Transmission
One of the major hurdles to overcome in our understanding of the nervous system is that of memory storage after learning. Synapses provide latency changes through the actions of neurotransmitters, resulting in unpredictable timing for computation [8,10,24,35]. Although electrical synapses have reduced latency [10] compared with chemical synapses, together they produce a spectrum of latencies depending upon their exact construction and location. Chemical synapses, but not electrical synapses [39,42], act as a unidirectional valve preventing backpropagation across synapses and separating neurons. In terms of network theory, this changes conventional network models in which synapses are the nodes, replacing them with the convergences of neurons; in other words, the neuron itself, not the synapse, is the computational unit. Synapses in this model represent a slower, secondary modulation of the faster, more precise frequency computation, acting by changing interneuron latencies.
3.2 Learning and Memory Storage
Currently, the role of synaptic plasticity in learning and memory is under intense scrutiny by cognitive scientists [29,38,46], with the suggestion that memories are molecularly stored within neurons and that plastic synaptic changes in weighting occur only after learning has occurred and been stored in memory. Support for this idea comes from work on Lymnaea stagnalis [46,47] and on Aplysia [48], where it has been demonstrated that long-term memory is stored in cell bodies. Thus, plastic changes in synaptic weighting may be “a means of regulating behavior… only after learning has already occurred” [38,47]. Such assumptions are consistent with findings that increased associative learning efficiency is correlated with increased weighting of glutamatergic synapses and is down-regulated by increased weighting of GABAergic inputs to neurons in the mouse barrel cortex. Another approach, and the one we have taken, is that major memory storage systems clearly reside within the brains of vertebrates and may be stored within reverberatory circuits (see below and Johnson and Winlow [9,35]).
4. Neurocomputation
4.1 The Action Potential Peak Is Not Suitable for Computational Modelling in the Brain
Action potentials are highly plastic phenomena and vary greatly in trajectory from one neuron to the next. The temporal positioning of the spike peak is very variable (Figure 4) and modifiable by synaptic inputs; consequently, it is inappropriate for use in binary computational models of neuronal activity. Here we demonstrate that only the action potential threshold, taken from the beginning of the soliton, has temporal constancy and should therefore be used in ternary computational models (see below), as in the APPulse, whose phases are: resting potential, threshold, and the time-dependent refractory period. The refractory period is an analogue variable, formed by the recovery of the membrane, able to cancel all other action potentials in its path. Thus, the APPulse soliton threshold is the most appropriate temporal fixed point for computational modelling, not the action potential peak, and not the loosely defined action potential threshold.
Figure 4 Plasticity of action potential shape and action potential peak recorded from the soma of a fast-adapting pedal I cluster neuron (for details see [44]) in the intact brain of the mollusc Lymnaea stagnalis (L.). The cell was normally silent, and activity was initiated by a 0.2 nA current pulse of 3 s duration injected into the cell via a bridge balanced recording electrode. The same three spikes are represented in each case; (A) on a slow time base, (B) on a faster time base and (C) as a phase plane portrait in which rate of change of voltage (dV/dt) is plotted against voltage itself and the inward depolarizing phase is displayed downward, maintaining the voltage clamp convention (see [43] for details of the phase plane technique). In each trace the peak of the first action potential is indicated by an orange arrow, the second action potential peak is unlabelled, and the third is indicated by a green arrow. The three successive spike peaks clearly vary temporally from one another, but the threshold point of initiation remains constant, as indicated in (C) by the blue arrow in the phase plane portrait. From Winlow and Johnson, 2020 [10], licensed under Creative Commons BY-NC-SA 4.0.
4.2 Neural Transactions May Be Performed by Quantum Phase Ternary Frequency Computation
Historically, the action potential has been viewed as a binary event, and binary mathematics has been assumed to mediate computation in the brain. However, computation in a network may occur in several ways; binary notation is not exclusive, and logic may use other bases and timings.
4.2.1 The Timing of the Spike is Directly Related to Threshold
Superficial calculations on timing, facility of computation, and error demonstrate that only a few current models apply to vertebrate brains or those of advanced invertebrates, and almost all can be immediately discounted [9,10,44,46]. In realistic neural networks and in the brain, the only element of the action potential responsible for its live propagation is whatever mechanism causes the threshold at the leading edge. Thus, the threshold alone is the initiator of the action potential (Figure 5) and its temporal marker. The timing of the spike is therefore directly related only to the threshold, which is a quantum event. Additionally, the threshold may be better defined temporally, not as a rising potential difference but as a direct change over time. The rest of the action potential is concerned only with the refractory period and stabilisation to the resting potential and is irrelevant to the speed of transmission. Of course, the refractory period can affect the frequency of transmission [10] and affect computation by backpropagation. In computational terms, this means that computation is a phase ternary event composed of:
- Resting potential of indeterminate length, actively maintained by membrane pumps.
- Quantum threshold - a temporal fixed point, unlike the plastic action potential peak or the action potential threshold. Above this threshold, a quantum of information, powered by the ATP pumps, passes through the neuron to the next synapse, with the HH spike arriving rather later as part of the refractory process.
- Refractory phase, during which no new action potentials can occur; unlike the other two phases, it is an analogue variable. It can reroute action potentials along different pathways at bifurcations (see Johnson and Winlow [10]).
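The three phases listed above can be represented as a minimal data structure. The class and field names are hypothetical, and the threshold is treated as an instantaneous temporal fixed point for simplicity:

```python
from dataclasses import dataclass
from enum import Enum

class Phase(Enum):
    """The three computational phases of the phase-ternary model."""
    RESTING = 0     # indeterminate length, actively maintained by pumps
    THRESHOLD = 1   # quantum temporal fixed point
    REFRACTORY = 2  # analogue, time-dependent third phase

@dataclass
class APPulse:
    threshold_time: float  # s; the temporal fixed point used for computation
    refractory: float      # s; analogue duration of the third phase

    def phase_at(self, t: float) -> Phase:
        """Phase of the membrane at time t relative to this pulse."""
        if t == self.threshold_time:
            return Phase.THRESHOLD  # treated as instantaneous here
        if self.threshold_time < t < self.threshold_time + self.refractory:
            return Phase.REFRACTORY
        return Phase.RESTING

    def annuls(self, other_threshold_time: float) -> bool:
        """True if a pulse arriving at other_threshold_time falls inside this
        pulse's refractory window and is therefore redacted."""
        return self.phase_at(other_threshold_time) is Phase.REFRACTORY
```

This makes the asymmetry explicit: the threshold is a single fixed point in time, while the refractory phase is an interval whose analogue duration determines which later pulses are annulled.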
Figure 5 (A) The nerve impulse appears to be an ensemble of three inseparable components: the physiological action potential, as described by Hodgkin, Huxley, and Katz (1952) [3], the action potential pulse (Johnson and Winlow, 2018) [24], and the computational action potential (from Winlow and Johnson 2021 [49], with permission). (B) Illustration of input and output frequencies showing circuit memory looping. Each APPulse contains a trit (ternary memory) of information coded in phase (blue). Collision of input frequencies from a and b forms a distinct phase change in APPulses at point c, changing the frequencies of the output over time according to input phases. Phase is changed by additional input APPulse collisions, which combine or redact according to the rules of phase collisions (Figure 3), resulting in appropriately changing frequencies of APPulses within the memory circuit (red) and the output. APPulses pass directly to the output via neuron f and simultaneously proceed around the loop. The patterns of repeating APPulses circulating in the loop form circuit memory when successive APPulses collide with both further inputs from a and b and the result from f. Therefore, the phase change of looping APPulses represents not just the immediate result from a and b, but memory of previous inputs coded into large or small changes in the phase of the threshold. Each APPulse in the loop that has its phase changed by a value greater than the threshold has stored 1 trit (ternary memory) of information. The phase of the circuit logically synchronises the outputs according to the input and the change in recorded memory caused by the change in frequencies held in the (red) loop.
The threshold is produced by the opening of sodium gates by the soliton and is the rate limiter of a neuron (Figure 4). If we are correct, this is the functional equivalent of the ‘gating’ mechanism in a conventional computer. If the threshold of the soliton has insufficient entropy to open the ion channels, propagation fails. In the APPulse model, both the soliton and the action potential cancel on collision, with the timing and latency being controlled by the ‘soliton’. However, no spike is required for computation with the APPulse, so it can occur even in spikeless neurons. The refractory period is only relevant when action potentials collide in phase-ternary computation (PTC), which is accurate to microseconds, i.e., it is both fast and efficient [10,24,35]. Elsewhere, we have described how this may occur in the vertebrate retina [8,24,35]. Furthermore, where electrical synapses occur, the APPulse will pass unhindered from one neuron to the next. Electrical synapses ensure spike synchronization and reliable transmission, which influences bursting patterns and firing frequency [50]. At a chemical synapse, however, neurotransmission means the pulse takes longer to pass. Glial cells such as astrocytes are perfectly positioned to provide mechanical continuity [35], and doubtless there will be equivalent glial cells in invertebrates.
4.3 Neurocomputation Underlying Perception and Sentience in Neocortex
As demonstrated above, the physiological action potential is not sufficiently precise for computation. The APPulse, however, is much more precise because it relies not on the action potential peak but on the precise timing of the opening of the ion channels that provides entropy for the threshold of the soliton: the quantum threshold. It should be noted that the APPulse is not binary, but a compound digital ternary object with an analogue third phase. The electrophysiologically recorded action potential has three phases: (i) the resting potential; (ii) the spike, whose peak is frequency dependent and highly plastic (Figure 4); and (iii) the refractory period, a time-dependent analogue variable.
From what we have set out above, it should be clear that the basis for computation in the brain is the quantum threshold of the ‘soliton’, which accompanies the ion changes of the action potential, together with the refractory membrane at convergences. Knowing that, we have been able to provide a logical explanation, from the action potential up to a neuronal model, of the coding and computation of the retina [8]. We have shown elsewhere that this model automatically redacts error [23,24,35]. This is particularly important in the parallel connections of the neocortex, where memory is encoded in reverberatory loops of APPulses [35,49]. However, unlike Turing-based conventional computers and artificial intelligence (AI), there is no functioning clock in the nervous system, and computation occurs by relative timing of the concurrent frequencies of quantum pulses across parallel inputs. We showed in the retina that timing from one point to another depends upon transmission speed and the length of the neuron, producing latencies that are fixed for each neuron but vary from one neuron to another. In this context, therefore, the quantum begins at the threshold of the soliton pulse and ends with the refractory period.
4.3.1 Sensory Inputs to the CNS
We have explained elsewhere how the visual cortex is likely to operate by quantum phase processing [9,35,49]. We know vision is both robust and error-free, with the eye able to adapt during macular degeneration, and this physiology is best explained using the computational action potential model [8,24,35]. The same is likely to be true of hearing, where we have yet to apply this model, and of other sensory systems that have similar connectivity, converging neuronal structures, and an appropriate cortical neural network. The cortical structures of the brain resemble small-world networks, such as those described elsewhere [51], in which connectivity is facilitated through multiple connections and each connection is temporally proximal to all others, so that computation is performed rapidly. In such a system, parallel frequencies of APPulses are able to collide into definable patterns, creating distinct object representations [8]. Elsewhere, in the eye and auditory systems, we have shown how many sensory cells are mapped to single neurons and that convergences of neurons are common.
4.3.2 Diffraction of Information Across Neural Networks
We have also demonstrated, using the threshold and refractory period of a quantum phase pulse, that action potentials diffract across a randomly formed neural network, such as the cortex, due to the annulment of parallel collisions in phase ternary computation (PTC). Thus, PTC applied to neuron convergences computes the collective mean sampled frequency and is the only mathematical solution within the constraints of brain neural networks (BNN) [8]. For example, two neurons converging with APPulses at 40 Hz will output 40 Hz and not 80 Hz. In the retina and other sensory areas, we have discussed how this information is initially coded and then interpreted in terms of network abstracts within the lateral geniculate nucleus (LGN) and visual cortex.
By defined neural patterning within a neural network, and by adjusting frequencies, we have been able to theorise that abstraction occurs much as in a contextual network, where objects and thoughts are contextually compared and memory is stored geographically in the network. This idea will be familiar to network computer scientists and observers of brain activity [35]. The output of frequencies from the visual cortex represents information amounting to abstract representations of objects in increasing detail. Evidence for this can be extracted from the nerve tracts leaving the LGN, and we propose that these loops provide time synchronisation to the neocortex for the most basic representations. The full image is therefore combined in the neocortex with other sensory modalities, so that it receives concurrent information about the object from the eye together with all the abstracts that make up the object. Spatial patterns in the visual cortex are formed from individual patterns illuminating the retina, and reverberatory loops of APPulses encode memory. We believe that a similar process of PTC may take place in the cochlea and associated ganglia, as well as in ascending information from the spinal cord, and that this function should be considered universal wherever convergences of neurons occur.
4.3.3 Neural Convergences and Accuracy of Computation
Our findings are summarised in Figure 5. It should be noted that the advantage of using neuron convergences is that near-static axon latencies give an accuracy of computation, at the molecular level of the membrane, of between 10⁻⁶ and theoretically 10⁻⁹ s. For example, if each quantum threshold takes a maximum of 10⁻⁶ s, up to 10⁶ parallel neuron inputs can be distinguished from each other in 10⁻⁶ s across nodes (convergences and divergences) in each computation, and two in-line computations can theoretically compute accurately across 10¹² nodes. In practice, neurons typically have 1000 direct connections or fewer, connected as a small-world network (everything connected in less than 3 moves) in which neurons are closely connected. This computational power is error-free, and because of almost unlimited inputs, phase ternary logic has no limit on extension. For example, the lowest precision of 10⁻⁶ s with 1000 connections, as in the retina, gives an equivalent clock speed for computation of 10⁻¹⁰ s, in addition to being able to parallel process across 10⁶ neuron convergences simultaneously. This system may explain the existence of known loops within the central nervous system [52], and may retain memory through the same mechanism.
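The small-world figures quoted above (roughly 1000 direct connections per neuron, everything connected in fewer than three moves) can be checked with simple arithmetic; the numbers are taken from the text and are order-of-magnitude estimates only:

```python
# Order-of-magnitude reach of a small-world network in which each neuron has
# ~1000 direct connections (figure quoted in the text above).
CONNECTIONS = 1000

for hops in range(1, 4):
    # Upper bound on neurons reachable within `hops` moves
    print(hops, CONNECTIONS ** hops)
```

Three hops already bound 10⁹ candidate neurons, which is consistent with the claim that in such a network everything is connected in fewer than three moves.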
In the above example, the memory loop in the network holds the short-term circuit memory of the mean sampled changes of neurons a and b, flowing from i and ii to the circulating APPulse memory-bank path (red) iii, v, vi … iv, iii clockwise (neurotransmitter synapses act as unidirectional diodes of fixed latency; synapse b-c is an electrical synapse and is thus bidirectional, able to communicate by backpropagation to b). Any recent change in frequency activity in a and b will affect looping APPulses in path iii, v, vi … iv, iii, should they collide, maintaining the memory of recently sampled frequencies. Path f contains fast information from c, synchronising memory in the loop. The output therefore contains synchronised information on the mean sampled frequency changes of a and b together with information on previous frequencies. Memory can therefore be extracted from this system and passed to further computational elements. Larger, deeper networks with many more connections will maintain memory in synchronicity over a longer period and with greater complexity, as this system is infinitely extendable.
We propose that this mechanism is elemental memory held in the neural networks and that synapses modulate the functioning of the brain over the longer term.
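The looping memory described above can be sketched in the same spirit: a set of pulse phases circulates around a loop, and an input pulse that falls inside the refractory window of a circulating pulse redacts it, updating the stored pattern. The refractory value and the simplified collision rule are illustrative assumptions:

```python
# Minimal sketch of circuit memory in a reverberatory loop: the stored
# pattern is a set of pulse phases within one circulation; new inputs merge
# with it, and any pulse inside the refractory window opened by the previous
# kept pulse is redacted.
REFRACTORY = 0.003  # s, assumed refractory period

def step_loop(stored_phases, input_phases, refractory=REFRACTORY):
    """One circulation of the loop: merge the circulating pulse pattern with
    newly arriving inputs, applying the redaction rule."""
    merged = sorted(set(stored_phases) | set(input_phases))
    kept, last = [], None
    for phase in merged:
        if last is None or phase - last >= refractory:
            kept.append(phase)
            last = phase
    return kept

# A stored pattern persists unchanged while no input arrives...
memory = [0.000, 0.010, 0.025]
assert step_loop(memory, []) == memory

# ...and collision with a new input updates it: 0.009 redacts 0.010.
memory = step_loop(memory, [0.009])
print(memory)  # [0.0, 0.009, 0.025]
```

The circulating pattern thus reflects not only the most recent inputs but the history of earlier collisions, which is the sense in which the loop holds short-term circuit memory.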
5. Discussion
5.1 Computation by the APPulse
As we have seen, the nerve impulse is not just about the activation of muscles in a reflex loop. The fundamental question remaining for neurophysiology is how the action potential computes, and it is this question that distinguishes the Hodgkin-Huxley action potential, the soliton, and the APPulse. In this paper we have described and discussed a possible mechanism for computation in the brain, with evidence ranging from an explanation of micromolecular events to a macro-scale explanation of logical computing operations in the brain neural network. We have previously discussed how other models use binary coding and are based on conventional computing [9,53]. Only the APPulse offers an explanation of how computation in the brain occurs without absolute timing, in a random, plastic environment of neurons, and with the required timing and accuracy. We therefore propose that computation using synapses is critically slower and more concerned with neuromodulation. We have previously suggested that in the retina [8], synaptic computation controls the light and dark reflexes. However, synapses cannot explain the timing and finite error management of the retina, which can be explained by frequency computation with the quantum APPulse. Computation is especially relevant in a brain neural network, where many thousands of inputs converge and must be distinguished from one another. In this paper, we have discussed temporal computation in which frequencies of the APPulse are timed relative to one another and the quantum threshold precision approaches the molecular speed of ion-gate opening.
5.2 Computation Through a Parallel Network
Computation through a parallel network is not absolutely time dependent; it is relatively time dependent upon the frequency changes of the signal quanta, error-free at each stage, with neuron latency as the only delay. We argue that the speed of ‘gating’ quantum phase computation depends upon the physical opening of the ion channels in the membrane. In a parallel network, millions of inputs can be calculated concurrently. Thus, no common element exists between the evidence of nerve transmission and current conventional computational models, or indeed computational science. The importance of error management cannot be overstated, as the APPulse model permits lossless signaling of quantal information along parallel neurons.
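The claim that parallel computation is relatively rather than absolutely time dependent can be illustrated with a small sketch: a fixed neuronal latency shifts every pulse by the same amount, so the inter-pulse intervals that carry the frequency code pass through unchanged. The function names and latency values are illustrative assumptions (integer microseconds are used simply to keep the arithmetic exact).

```python
# Hypothetical sketch: fixed latencies preserve relative timing.
# Names and latency values are illustrative, not measured.

def transmit(pulse_times, latency_us):
    """Pulses leave a neuron a fixed latency after they arrive (times in µs)."""
    return [t + latency_us for t in pulse_times]

def intervals(pulse_times):
    """The relative code: gaps between successive pulses."""
    return [b - a for a, b in zip(pulse_times, pulse_times[1:])]

train = [0, 4000, 9000, 15000]               # input pulse train (µs)
out = transmit(transmit(train, 2000), 3000)  # two neurons in series

# Absolute times shift by the summed latency, but the frequency
# pattern — the computationally relevant signal — is identical.
assert intervals(out) == intervals(train) == [4000, 5000, 6000]
```

In this sense no shared clock is needed along a parallel thread: each path adds only its own latency, and the code carried by frequency changes is untouched.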
5.3 There Is No Clock in the Nervous System
Until now, frequency-modulated computation has not been considered by computer scientists, who have followed conventional, almost exclusively binary and ‘clocked’ computation. Their assumption that commercial success defines the workings of the brain is therefore unsupportable. We propose that the APPulse and quantum frequency computation together form the most appropriate neural mechanism for the computational physiology of nervous communication, providing far greater efficiency and effectiveness for computation because they eradicate the error that exists within in silico and other Turing-based systems.
6. Summary
- In physiology, it is clear from the evidence that the HH model cannot account for the precision needed to facilitate parallel computation in the brain. The assumption that the brain is a Turing machine presupposes accurate timing and synchronization of the whole neural network by a centralized clock, for which there is no evidence. With the APPulse, a central clock is unnecessary because computational events synchronize during frequency computation of the quantum APPulse across parallel threads.
- The quantum element of the APPulse takes the form of the mechanical opening of the ion-gates at the point of threshold by the pulse and is essential for synchronised parallel computation.
- The HH action potential is therefore an essential mechanism for entropy exchange, enabling the APPulse to propagate continuously along the membrane as the ion channels open on mechanical stimulation by the leading edge of the soliton: this is the threshold.
- Many simultaneous parallel inputs of quantum APPulses may act together and create output patterns according to their respective frequencies of APPulses. As these patterns spread throughout a brain neural network, we propose that they interact and disperse across the matrix according to these changing frequencies (Figure 5). It is these patterns in the brain that form our perception. We have termed the circuit memories in the memory loops as elemental memory.
- Timing in the brain must account for the precision and activity of the neural networks, the ability to react to fast changes in stimulus, and the capacity for learning. The action potential alone cannot provide this functionality, as it lacks the necessary precision and timing, and no other method of achieving it has been discovered within the brain. However, the computational action potential, formed from the APPulse soliton threshold and acting as a frequency-modulated computer, builds on the work of HH to provide this ability, diffracting to form the memory and activity observed in the brain in concert with the refractory period of the HH action potential. In other words, the HH action potential cannot act alone, and computation must be facilitated by the leading edge of the soliton.
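The interaction of parallel inputs described in the bullets above can be sketched as a toy convergence model: two pulse trains merge at a shared neuron, and, under the assumed rule that a pulse falling inside the refractory wake of the previously transmitted pulse is annihilated, the output pattern depends on both input frequencies. The refractory value and train timings are illustrative assumptions only.

```python
# Hypothetical sketch of frequency interference at a convergence.
# Assumed rule: a pulse arriving within the refractory period left by the
# last transmitted pulse is annihilated. Values are illustrative.

REFRACTORY = 2  # ms, assumed

def converge(train_a, train_b, refractory=REFRACTORY):
    """Merge two pulse trains at a shared neuron, dropping pulses that
    fall inside the refractory wake of the last transmitted pulse."""
    out, last = [], None
    for t in sorted(train_a + train_b):
        if last is None or t - last >= refractory:
            out.append(t)
            last = t
    return out

a = list(range(0, 30, 5))   # faster train: a pulse every 5 ms
b = list(range(1, 30, 7))   # slower train, offset by 1 ms: every 7 ms
pattern = converge(a, b)
```

Changing either input frequency changes which pulses survive the merge, so the output pattern encodes the relationship between the converging frequencies — a minimal analogue of the interference patterns the text proposes spread through the network matrix.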
Glossary

Author Contributions
Conceptualization was initiated by WW and developed by WW and ASJ. Writing the article and developing the table and figures was a combined effort of WW and ASJ.
Competing Interests
The authors have declared that no competing interests exist.
References
- Nobel Prize Outreach. Sir Charles Sherrington [Internet]. Nobel Prize Outreach; 1932. Available from: https://www.nobelprize.org/prizes/medicine/1932/sherrington/facts/.
- Nobel Prize Outreach. Edgar Adrian [Internet]. Nobel Prize Outreach; 1932. Available from: https://www.nobelprize.org/prizes/medicine/1932/adrian/facts/.
- Hodgkin AL, Huxley AF, Katz B. Measurement of current-voltage relations in the membrane of the giant axon of Loligo. J Physiol. 1952; 116: 424-448. [CrossRef] [Google scholar]
- Rall W. Core conductor theory and cable properties of neurons. Compr Physiol. 1977: 39-97. [CrossRef] [Google scholar]
- Miles FA. Excitable cells. London, UK: William Heinemann Medical Books Ltd.; 1969. [Google scholar]
- Drukarch B, Wilhelmus MM. Understanding the scope of the contemporary controversy about the physical nature and modeling of the action potential: Insights from history and philosophy of (neuro)science. OBM Neurobiol. 2025; 9: 269. [CrossRef] [Google scholar]
- Winlow W, Johnson AS. Nerve impulses have three interdependent functions: Communication, modulation, and computation. Bioelectricity. 2021; 3: 161-170. [CrossRef] [Google scholar]
- Johnson AS, Winlow W. Are neural transactions in the retina performed by phase ternary computation? Ann Behav Neurosci. 2019; 2: 223-236. [CrossRef] [Google scholar]
- Johnson AS, Winlow W. Shortcomings of current artificial nodal neural network models. EC Neurol. 2017; 4: 198-200. [Google scholar]
- Winlow W, Johnson AS. The action potential peak is not suitable for computational modelling and coding in the brain. EC Neurol. 2020; 12: 46-48. [Google scholar]
- Heimburg T, Jackson AD. On soliton propagation in biomembranes and nerves. Proc Natl Acad Sci. 2005; 102: 9790-9795. [CrossRef] [Google scholar]
- Tasaki I. A macromolecular approach to excitation phenomena: Mechanical and thermal changes in nerve during excitation. Physiol Chem Phys Med NMR. 1988; 20: 251-268. [Google scholar]
- Tasaki I, Kusano K, Byrne P. Rapid mechanical and thermal changes in the garfish olfactory nerve associated with a propagated impulse. Biophys J. 1989; 55: 1033-1040. [CrossRef] [Google scholar]
- El Hady A, Machta BB. Mechanical surface waves accompany action potential propagation. Nat Commun. 2015; 6: 6697. [CrossRef] [Google scholar]
- Batabyal S, Satpathy S, Bui L, Kim YT, Mohanty S, Bachoo R, et al. Label-free optical detection of action potential in mammalian neurons. Biomed Opt Express. 2017; 8: 3700-3713. [CrossRef] [Google scholar]
- Ling T, Boyle KC, Goetz G, Zhou P, Quan Y, Alfonso FS, et al. Full-field interferometric imaging of propagating action potentials. Light Sci Appl. 2018; 7: 107. [CrossRef] [Google scholar]
- Perez-Camacho MI, Ruiz-Suárez JC. Propagation of a thermo-mechanical perturbation on a lipid membrane. Soft Matter. 2017; 13: 6555-6561. [CrossRef] [Google scholar]
- Mussel M, Schneider MF. It sounds like an action potential: Unification of electrical, chemical and mechanical aspects of acoustic pulses in lipids. J R Soc Interface. 2019; 16: 20180743. [CrossRef] [Google scholar]
- Shrivastava S, Kang KH, Schneider MF. Collision and annihilation of nonlinear sound waves and action potentials in interfaces. J R Soc Interface. 2018; 15: 20170803. [CrossRef] [Google scholar]
- Barz H, Schreiber A, Barz U. Nerve impulse propagation: Mechanical wave model and HH model. Med Hypotheses. 2020; 137: 109540. [CrossRef] [Google scholar]
- Sattigeri RM. Action potential: A vortex Phenomena; driving membrane oscillations. Front Comput Neurosci. 2020; 14: 21. [CrossRef] [Google scholar]
- Holden AV, Yoda M. Ionic channel density of excitable membranes can act as a bifurcation parameter. Biol Cybern. 1981; 42: 29-38. [CrossRef] [Google scholar]
- Johnson AS. The coupled action potential pulse (APPulse)-neural network efficiency from a synchronised oscillating lipid pulse Hodgkin Huxley action potential. EC Neurol. 2015; 2: 94-101. [Google scholar]
- Johnson AS, Winlow W. The soliton and the action potential-primary elements underlying sentience. Front Physiol. 2018; 9: 779. [CrossRef] [Google scholar]
- Marban E, Yamagishi T, Tomaselli GF. Structure and function of voltage-gated sodium channels. J Physiol. 1998; 508: 647-657. [CrossRef] [Google scholar]
- Catterall WA. Voltage-gated sodium channels at 60: Structure, function and pathophysiology. J Physiol. 2012; 590: 2577-2589. [CrossRef] [Google scholar]
- Zhang XC, Liu Z, Li J. From membrane tension to channel gating: A principal energy transfer mechanism for mechanosensitive channels. Protein Sci. 2016; 25: 1954-1964. [CrossRef] [Google scholar]
- Martinac B. Mechanosensitive ion channels: An evolutionary and scientific tour de force in mechanobiology. Channels. 2012; 6: 211-213. [CrossRef] [Google scholar]
- Takahashi N, Oertner TG, Hegemann P, Larkum ME. Active cortical dendrites modulate perception. Science. 2016; 354: 1587-1590. [CrossRef] [Google scholar]
- Sukharev S, Anishkin A. Mechanosensitive channels: History, diversity, and mechanisms. Biochem Moscow Suppl Ser A. 2022; 16: 291-310. [CrossRef] [Google scholar]
- Ritchie JM, Keynes RD. The production and absorption of heat associated with electrical activity in nerve and electric organ. Q Rev Biophys. 1985; 18: 451-476. [CrossRef] [Google scholar]
- Howarth JV, Keynes RD, Ritchie JM, von Muralt A. The heat production associated with the passage of a single impulse in pike olfactory nerve fibres. J Physiol. 1975; 249: 349-368. [CrossRef] [Google scholar]
- Moujahid A, D'Anjou A, Graña M. Energy demands of diverse spiking cells from the neocortex, hippocampus, and thalamus. Front Comput Neurosci. 2014; 8: 41. [CrossRef] [Google scholar]
- Peets T, Tamm K, Engelbrecht J. On mathematical modeling of the propagation of a wave ensemble within an individual axon. Front Cell Neurosci. 2023; 17: 1222785. [CrossRef] [Google scholar]
- Johnson AS, Winlow W. The nature of quantum parallel processing and its implications for coding in brain neural networks: A novel computational mechanism. Front Netw Physiol. 2025; 5: 1632144. [CrossRef] [Google scholar]
- Galinsky VL, Frank LR. The wave nature of the action potential. Front Cell Neurosci. 2025; 19: 1467466. [CrossRef] [Google scholar]
- McCusker EC, Bagnéris C, Naylor CE, Cole AR, D'avanzo N, Nichols CG, et al. Structure of a bacterial voltage-gated sodium channel pore reveals mechanisms of opening and closing. Nat Commun. 2012; 3: 1102. [CrossRef] [Google scholar]
- Trettenbrein PC. The demise of the synapse as the locus of memory: A looming paradigm shift? Front Syst Neurosci. 2016; 10: 88. [CrossRef] [Google scholar]
- Grundfest H. Current state of physiology and pharmacology of synapses and certain conclusions for the mammalian central nervous system. Cesk Fysiol. 1959; 8: 484-504. [Google scholar]
- Winlow W. Neuronal communications. Manchester, UK: Manchester University Press; 1990. [Google scholar]
- Dowling JE. The retina: An approachable part of the brain. Cambridge, MA: Harvard University Press; 1987. [Google scholar]
- Shepherd GM. Axons, dendrites and synapses. Fed Proc. 1975; 34: 1395-1397. [CrossRef] [Google scholar]
- Roberts A, Bush BMH. Neurones without impulses: Their significance for vertebrate and invertebrate nervous systems (Society for Experimental Biology Seminar Series, No. 6). Cambridge, UK: Cambridge University Press; 1981. [Google scholar]
- Johnson AS, Winlow W. Neurocomputational mechanisms underlying perception and sentience in the neocortex. Front Comput Neurosci. 2024; 18: 1335739. [CrossRef] [Google scholar]
- Johnson AS. The coupled cardiac action potential pulse (CAPpulse)-synchronised oscillating mechanical pulse cardiac action potential. EC Neurol. 2016; 3: 520-530. [Google scholar]
- Scheibenstock A, Krygier D, Haque Z, Syed N, Lukowiak K. The Soma of RPeD1 must be present for long-term memory formation of associative learning in Lymnaea. J Neurophysiol. 2002; 88: 1584-1591. [CrossRef] [Google scholar]
- Sangha S, Varshney N, Fras M, Smyth K, Rosenegger D, Parvez K, et al. Memory, reconsolidation and extinction in Lymnaea require the soma of RPeD1. In: Post-genomic perspectives in modeling and control of breathing. Boston, MA: Springer; 2004. pp. 311-318. [CrossRef] [Google scholar]
- Chen S, Cai D, Pearce K, Sun PY, Roberts AC, Glanzman DL. Reinstatement of long-term memory following erasure of its behavioral and synaptic expression in Aplysia. Elife. 2014; 3: e03896. [CrossRef] [Google scholar]
- Johnson AS, Winlow W. Does the brain function as a quantum phase computer using phase ternary computation? Front Physiol. 2021; 12: 572041. [CrossRef] [Google scholar]
- Beekharry CC, Zhu GZ, Magoski NS. Role for electrical synapses in shaping the output of coupled peptidergic neurons from Lymnaea. Brain Res. 2015; 1603: 8-21. [CrossRef] [Google scholar]
- Bullmore E, Sporns O. Complex brain networks: Graph theoretical analysis of structural and functional systems. Nat Rev Neurosci. 2009; 10: 186-198. [CrossRef] [Google scholar]
- Shepherd GMG, Yamawaki N. Untangling the cortico-thalamo-cortical loop: Cellular pieces of a knotty circuit puzzle. Nat Rev Neurosci. 2021; 22: 389-406. [CrossRef] [Google scholar]
- Furber SB, Galluppi F, Temple S, Plana LA. The SpiNNaker project. Proc IEEE. 2014; 102: 652-665. [CrossRef] [Google scholar]