Subject : Will we ever have Real Quantum Neurons? (Re: neural networks cover rule based in zero order logic)
From : janburse (at) *nospam* fastmail.fm (Mild Shock)
Groups : comp.lang.prolog
Date : 15. Mar 2025, 17:04:08
Other headers
Message-ID : <vr48dn$1gr8q$1@solani.org>
References : 1 2
User-Agent : Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:128.0) Gecko/20100101 Firefox/128.0 SeaMonkey/2.53.20
Hi,
There are some ideas to realize the artificial
neuron used in belief networks on a quantum
computer. Via so-called “Repeat-Until-Success”
(RUS) circuits maybe?
See also:
Towards a Real Quantum Neuron
Wei Hu - 2018
https://www.scirp.org/journal/paperinformation?paperid=83091
Quantum Neuron
Yudong Cao et al. - 2017
https://arxiv.org/abs/1711.11240

Bye
Mild Shock wrote:
A storm of symbolic differentiation libraries
was posted. But what can these Prolog code
fossils do?
Does one of these libraries support Python's
symbolic Piecewise? For example one can define
the rectified linear unit (ReLU) with it:
           / x   if x >= 0
ReLU(x) := <
           \ 0   otherwise
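
For reference, a minimal sketch of such a definition
with SymPy's symbolic Piecewise (assuming SymPy, as in
the how-to link below):

# Minimal sketch: ReLU as a symbolic Piecewise expression in SymPy.
from sympy import symbols, Piecewise

x = symbols('x', real=True)

# ReLU(x) := x if x >= 0, otherwise 0
relu = Piecewise((x, x >= 0), (0, True))

print(relu.subs(x, 3))   # -> 3
print(relu.subs(x, -2))  # -> 0
print(relu.diff(x))      # piecewise derivative: 1 for x >= 0, else 0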
With the above one can already translate a
propositional logic program that uses negation
as failure into a neural network:
NOT   \+ p           1 - x
AND   p1, ..., pn    ReLU(x1 + ... + xn - (n-1))
OR    p1; ...; pn    1 - ReLU(-x1 - ... - xn + 1)
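
A quick sanity check of that table on 0/1 truth values,
in plain Python (the helper names not_, and_ and or_ are
just for illustration, not from any of the libraries):

# The NOT / AND / OR encodings from the table above,
# evaluated on 0/1 truth values with a plain ReLU.
def relu(x):
    return max(x, 0)

def not_(x):
    return 1 - x

def and_(*xs):                    # p1, ..., pn
    n = len(xs)
    return relu(sum(xs) - (n - 1))

def or_(*xs):                     # p1; ...; pn
    return 1 - relu(-sum(xs) + 1)

# The encodings agree with Boolean logic on {0, 1}:
assert not_(0) == 1 and not_(1) == 0
assert and_(1, 1, 1) == 1 and and_(1, 0, 1) == 0
assert or_(0, 0, 0) == 0 and or_(0, 1, 0) == 1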
For clauses just use the Clark Completion: it
makes the defined predicate a new neuron,
dependent on the other predicate neurons
through a network of intermediate neurons.
Because of the constant shift in AND and OR,
the neurons will have a bias b.
So rule-based reasoning in zero-order logic is
a subset of neural networks.
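
As a toy illustration of the whole pipeline (my own
example, not taken from any posted library): a two-clause
program, its Clark Completion, and the resulting ReLU
neurons with weights and a bias:

# Program (with negation as failure):
#   p :- q, \+ r.
#   p :- s.
# Clark Completion:  p <-> (q & ~r) | s
def relu(x):
    return max(x, 0)

def neuron(weights, bias, inputs):
    # one ReLU neuron: weighted sum plus bias
    return relu(sum(w * v for w, v in zip(weights, inputs)) + bias)

def p(q, r, s):
    # intermediate neuron for the body q, \+ r:
    #   AND(q, 1 - r) = ReLU(q + (1 - r) - 1), weights (1, -1), bias 0
    body1 = neuron([1, -1], 0, [q, r])
    # output neuron for body1 ; s:
    #   OR = 1 - ReLU(-body1 - s + 1), weights (-1, -1), bias 1
    return 1 - neuron([-1, -1], 1, [body1, s])

# agrees with (q and not r) or s on all 0/1 inputs
for q in (0, 1):
    for r in (0, 1):
        for s in (0, 1):
            assert p(q, r, s) == int((q and not r) or s)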
Python symbolic Piecewise
https://how-to-data.org/how-to-write-a-piecewise-defined-function-in-python-using-sympy/
rectified linear unit (ReLU)
https://en.wikipedia.org/wiki/Rectifier_(neural_networks)
Clark Completion
https://www.cs.utexas.edu/~vl/teaching/lbai/completion.pdf