Consider the binary computer. Every bit has one of two states, 0 or 1, 'off' or 'on' with reference to a circuit, and symbolic of a 'fact of the world' propositionally: A is true or false. In a quantum computer, a bit may occupy both states in superposition: the quantum state is α|1⟩ + β|0⟩, where the amplitudes fix a probability distribution over outcomes, so that measurement yields 1 with probability a = |α|² ('turning on') and 0 with probability 1 − a. The quantum computer, at least locally, ultimately collapses into a common electromagnetic binary computer when the true value of each bit is measured as either 1 or 0, yet before then it is suspended in the superposed quantum state. Thus the resultant value of measurement is the answer to a particular question, while the quantum state is the intermediation of problem-solving. The problem is input as the initial conditions of the quantum formulation: the distributions of the different informational bits (i.e. the 'a' values). Altogether, this is the modern formulation of randomness introduced to a primitive abacus, on which beads might be slid varying lengths between either edge to represent the initial value of each informational bar; the abacus is dropped (on a sacred object) to introduce random interaction between the parts; and the resulting answer is decoded by whether each bead lies closer to one side (1) or the other (0). Random interaction is allowed between informational elements through relational connections, so that the system can interact with itself in totality, representing the underlying distributional assumptions of the problem.
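The measurement behaviour described above can be sketched classically: a single bit prepared with probability a of reading 1 collapses to 0 or 1 on each measurement, and only repeated sampling recovers the underlying distribution. A minimal Monte Carlo stand-in (the function name and trial count are illustrative assumptions, not a real quantum simulation):

```python
import random

def measure(a, trials=10000):
    """Repeatedly 'measure' a bit whose probability of reading 1 is a.

    Each measurement collapses the superposition to 1 with probability a,
    else 0; the return value is the observed frequency of 1s.
    """
    return sum(1 if random.random() < a else 0 for _ in range(trials)) / trials

# A bit prepared with a = 0.7 reads 1 on roughly 70% of measurements.
print(measure(0.7))
```

A single call to `measure(a, trials=1)` gives only 0 or 1, which is the sense in which the answer, once measured, is an ordinary binary fact.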

If a quantum computer is to be used to find the overall objective truth of a set of seemingly valid propositional statements, each claim can be input as a single quantum bit carrying the perceived probability of its correctness, i.e. a validity metric, and the inter-relational system of claims input through oriented connections between bit-elements. After the quantum simulation, the system reaches a steady state with each bit either true (1) or false (0), allowing the resulting valid system, considering the uncertainty of and relationships between all the facts, to be determined. In a particularly chaotic system, the steady state may itself exhibit uncertainty, as when there are many equally good solutions to a problem, so that repeated sampling of the system gives varying results. The problem is thus formulated as a directed 'network' deep-graph and initialized as nodes, edge-lengths, and orientations. The random interaction of the system operates as partial-differential relations (directed edges) between the, here binary, random variables (nodes). The quantum computer therefore naturally solves problems formulated under the calculus class of partial differential equations for stochastic processes. The quasi-state nodes interact through pre-determined relations (assumptions) to reach an equilibrium of the total automaton as the state-of-affairs.
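The relaxation of such a claims-network can be sketched classically as a mean-field iteration: each node starts at its prior validity, and directed weighted edges pull neighbouring probabilities toward or away from truth until the whole system settles. This is an illustrative stand-in for the quantum dynamics described above, not an implementation of them; the function, edge encoding, and weights are all assumptions.

```python
import math

def mean_field(priors, edges, iters=200):
    """Relax a network of binary 'claims' to a steady state.

    priors: dict node -> prior probability of the claim being true.
    edges:  dict (src, dst) -> weight; a positive weight means src being
            true supports dst being true, a negative weight undermines it.
    """
    p = dict(priors)
    for _ in range(iters):
        new = {}
        for node, prior in priors.items():
            # Log-odds of the node's own prior, plus the weighted
            # influence of each incoming neighbour (scaled to [-1, 1]).
            logit = math.log(prior / (1 - prior))
            for (src, dst), w in edges.items():
                if dst == node:
                    logit += w * (2 * p[src] - 1)
            new[node] = 1 / (1 + math.exp(-logit))
        p = new
    return p

# Two mutually supporting, individually uncertain claims pull each
# other well above their priors.
result = mean_field({"A": 0.6, "B": 0.55}, {("A", "B"): 2.0, ("B", "A"): 2.0})
```

With strong negative weights between incompatible claims, the same iteration can settle into one of several competing assignments, mirroring the text's point that chaotic systems may have many equally good solutions.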

We may therefore consider a quantum voting machine to solve normative problems of policy and representation. Each person submits not just a single vote (0 or 1), but a quantum bit: a single subjective estimation of the validity of the claim at hand. The basic democratic assumption behind voting as a solution to the normative problem is that all votes are equal, so each single q-vote is connected to the central state-node with a proportional flow. The final state-solution will be a q-bit with probability equal to the average of all the q-votes, which may be close to the *extrema* (true or false), yet may also be close to the center (non-decidability). A *measured* decision by the state will thus result from *not collapsing* this random variable with all its information, especially if the probability is close to ½, thereby leaving the policy's value undecidable, although rules for further collapsing the distribution (e.g. passing the law only on a majority of the *popular vote*) can be established beforehand. It is also possible to create a more complicated method of aggregation than total connection, as with *districting* for the *electoral college*, by grouping certain votes and connecting these groupings to the whole state-decision through concentric meta-groupings. We may further complicate the model of quantum voting by allowing each citizen to submit not just a single q-vote, but a whole quantum system of validity statements instead of a single bit, such as the rationale for a single normative answer expressed through a hierarchically weighted validity tree of minor claims. In fact, if more information is input by each citizen (i.e. validity arguments), then the naturally representative systems of groupings between persons can be determined by, rather than prior to, the vote.
We end up with not a 'yes' or 'no' on a single proposition, but an entire representational system for a regulatory domain of social life: the structural-functional pre-image (i.e. 'photonegative').
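The simplest equal-weight scheme above can be sketched directly: the state q-bit is the plain average of the citizens' q-votes, and a pre-established rule decides when that average is far enough from ½ to collapse into a decision. The function name and the `margin` parameter are illustrative assumptions.

```python
def aggregate(q_votes, margin=0.1):
    """Aggregate citizens' q-votes (probabilities in [0, 1]) into a
    state-level q-bit, then apply a pre-established collapse rule.

    The state q-bit is the equal-weight average; it collapses to
    'pass' or 'fail' only when far enough from 1/2, and otherwise the
    question is left undecided, preserving the distribution.
    """
    state = sum(q_votes) / len(q_votes)
    if state > 0.5 + margin:
        return state, "pass"
    if state < 0.5 - margin:
        return state, "fail"
    return state, "undecided"

# Three hesitant supporters and one firm opponent: the average lands
# near 1/2, so the state declines to collapse the question.
print(aggregate([0.6, 0.55, 0.65, 0.1]))
```

Districting or an electoral college would amount to averaging within groups first and then averaging the group results, which in general gives a different answer than the flat popular-vote average.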

Quantum computing is a natural process of biological life. One can construct a DNA computer with quantum-like properties by encoding the initial conditions as the frequency distributions of certain gene patterns (built from the four bases) and the computer's dynamic system of interaction through the diffusion of an enzyme mixture, resulting in a distributional answer to a complex problem. More generally, within evolutionary theory the environment acts as a macro quantum computer: random mutation and natural selection create and select the best gene variations (polymorphisms) for a species' survival.
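The mutation-and-selection dynamic can be sketched as a toy genetic algorithm: random mutation proposes gene variants, and selection by an assumed environmental fitness keeps the best. Genes are bitstrings here, and every name and parameter is an illustrative assumption rather than a biological model.

```python
import random

def evolve(target, pop_size=50, generations=200, mut_rate=0.02):
    """Toy 'environment as computer': mutation proposes, selection disposes.

    Genes are bitstrings; fitness is the number of positions matching an
    assumed environmental target pattern.
    """
    n = len(target)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    fitness = lambda g: sum(1 for a, b in zip(g, target) if a == b)
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]                 # natural selection
        children = [[bit ^ (random.random() < mut_rate) for bit in g]
                    for g in survivors]                  # random mutation
        pop = survivors + children
    return max(pop, key=fitness)

# The population converges on the environmentally favoured pattern.
best = evolve([1, 0, 1, 1, 0, 0, 1, 0])
```

The answer is distributional in the text's sense: any single run samples one high-fitness genome, while the population as a whole carries a distribution over variants.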