Nowadays it is difficult to find a problem that has not yet been proposed to be solved by neural networks. And in many problems other methods are no longer even considered. In such a situation, it is logical that in pursuit of the βsilver bulletβ, researchers and technologists offer more and more new modifications of neural network architectures, which should bring the applicants βhappiness for all, for nothing, and let no one go offended!β However, in industrial tasks it often turns out that the accuracy of the model mainly depends on the cleanliness, size and structure of the training sample, and the interface needs to be reasonable on a neural network model (for example, it is unpleasant when the logic answer should be a variable-length list).
Another thing is productivity, speed. Here the dependence on architecture is direct and quite predictable. However, not all scientists are interested. It is much more pleasant to think for centuries, epochs, to mentally aim for a century when magically the computing power will be unimaginable, and energy is extracted from the air. However, there are also enough mundane people. And it is important for them that the neural networks are more compact, faster and more energy-efficient right now. For example, this is important when working on mobile devices and in embedded systems where there is no powerful video card or you need to save battery. A lot has been done in this direction: here are small-sized integer neural networks, and the removal of excess neurons, and tensor decomposition decompositions, and much more.
We managed to remove the multiplications from the calculations inside the neuron, replacing them with additions and taking the maximum, although we left the opportunity to use multiplications and nonlinear operations in the activation function. We called the proposed model a bipolar morphological model of a neuron.

, , β β - . , , . . , . , , , .
, , , , . , , . . . Labor omnia vΔ«cit improbus et dΕ«rΔ«s urgΔns in rΔbus egestΔs.
, β . 90- [1, 2]. . , [3], [4]. , , . [5], [6]. , . .
, , , , .
:
, , .
(-), . , . , 4 , :
. :
:
β , β . , , . , , (), β (). . , β , . , , β , . - .
- . 1. ReLU 4 : . . , .
, , . , , . (, -), β .

. 1. .
, - :
, . , . , , .
, , , : - ! , . ( 1) ( 2). ? , . , .
, -: - , , -, , . - : , , , .
, , , incremental learning β , . . - , . ββ β ( 1), β ( 2). ββ , , . , -, , , -.
MNIST
MNIST β , 60000 28 28. 10000 . 10% , β . . 2.

. 2. MNIST.
:
conv(n, w_x, w_y) β n w_x w_y;
fc(n) β n ;
maxpool(w_x, w_y) β max-pooling w_x w_y;
dropout(p) β dropout p;
relu β ;
softmax β softmax.
MNIST :
CNN1: conv1(30, 5, 5) β relu1 β dropout1(0,2) β fc1(10) β softmax1.
CNN2: conv1(40, 5, 5) β relu1 β maxpool1(2, 2) β conv2(40, 5, 5) β relu2 β fc1(200) β relu3 β dropout1(0,3) β fc2(10) β softmax1.
. 1. ββ . () ().
1. MNIST. β , β .
-, , - . , - , . , .
: . , . : - .
MRZ
MRZ- , (. . 3). 280 000 21 17 37 MRZ, .

. 3. MRZ .
CNN3: conv1(8, 3, 3) β relu1 β conv2(30, 5, 5) β relu2 β conv3(30, 5, 5) β relu3 β dropout1(0,25) β fc1(37) β softmax1.
CNN4: conv1(8, 3, 3) β relu1 β conv2(8, 5, 5) β relu2 β conv3(8, 3, 3) β relu3 β dropout1(0,25) β conv4(12, 5, 5) β relu4 β conv5(12, 3, 3) β relu5 β conv6(12, 1, 1) β relu6 β fc1(37) β softmax1.
2. ββ . () ().
, MNIST: -, , . - , - .
2. MRZ. β , β .
, , . , - . MNIST MRZ.
? , - . , (, ) . , β TPU, .
, , : , .
PS. ICMV 2019:
E. Limonova, D. Matveev, D. Nikolaev and V. V. Arlazarov, βBipolar morphological neural networks: convolution without multiplication,β ICMV 2019, 11433 ed., Wolfgang Osten, Dmitry Nikolaev, Jianhong Zhou, Ed., SPIE, Jan. 2020, vol. 11433, ISSN 0277-786X, ISBN 978-15-10636-43-9, vol. 11433, 11433 3J, pp. 1-8, 2020, DOI: 10.1117/12.2559299.
- G. X. Ritter and P. Sussner, βAn introduction to morphological neural networks,β Proceedings of 13th International Conference on Pattern Recognition 4, 709β717 vol.4 (1996).
- P. Sussner and E. L. Esmi, Constructive Morphological Neural Networks: Some Theoretical Aspects and Experimental Results in Classification, 123β144, Springer Berlin Heidelberg, Berlin, Heidelberg (2009).
- G. X. Ritter, L. Iancu, and G. Urcid, βMorphological perceptrons with dendritic structure,β in The 12th IEEE International Conference on Fuzzy Systems, 2003. FUZZ β03., 2, 1296β1301 vol.2 (May 2003).
- G. X. Ritter and G. Urcid, βLattice algebra approach to single-neuron computation,β IEEE Transactions on Neural Networks 14, 282β295 (March 2003).
- H. Sossa and E. Guevara, βEfficient training for dendrite morphological neural networks,β Neurocomputing 131, 132β142 (05 2014).
- E. Zamora and H. Sossa, βDendrite morphological neurons trained by stochastic gradient descent,β in 2016 IEEE Symposium Series on Computational Intelligence (SSCI), 1β8 (Dec 2016).