Advantages of the Omega Server neural network over classical neural networks
We created the Omega Server platform, where any individual user or organization can experiment with the PANN neural network on their own datasets, verify its unique qualities, and solve practical problems.

Feature | Competing classical neural networks* (Google, IBM, Microsoft) | PANN© neural networks |
---|---|---|
Required number of network training epochs | 1,000,000 | 10 |
Training time (7,000 images) | 12,000 s (3.3 hours) | 4 s |
Full task parallelization during network training | Not possible | Yes |
Online up-training of an already trained network and removal of unnecessary data | Not possible. Requires re-training | Yes |
Network size (number of inputs and outputs) | Limited. Exponential growth of training time | Unlimited. Processing of images of any size |
Increasing network size during computation | Not possible | Yes. Networks can be expanded to any size |
\* Classical neural networks tested with IBM SPSS Statistics 22.
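For readers who want to sanity-check the classical-network side of these figures, the timing can be approximated with an off-the-shelf toolkit. The sketch below is an illustration, not the Omega Server or SPSS setup: it assumes scikit-learn, and the 100-unit hidden layer, 50-epoch cap, and 7,000-image MNIST subset are our own illustrative choices.

```python
# A minimal sketch (not the Omega Server API): time a standard
# multilayer perceptron on a 7,000-image MNIST subset to get a
# rough classical-network baseline for the table above.
import time

from sklearn.datasets import fetch_openml
from sklearn.neural_network import MLPClassifier

# Download MNIST: 70,000 28x28 digit images, flattened to 784 features.
X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X_train, y_train = X[:7000] / 255.0, y[:7000]  # 7,000-image training subset

# Hidden-layer size and epoch cap are illustrative assumptions.
clf = MLPClassifier(hidden_layer_sizes=(100,), max_iter=50, random_state=0)

start = time.perf_counter()
clf.fit(X_train, y_train)
elapsed = time.perf_counter() - start

print(f"Training took {elapsed:.1f} s over {clf.n_iter_} epochs")
```

Exact numbers will vary with hardware, solver, and stopping criteria, so treat any single run as indicative rather than definitive.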
Tests
Browse a summary of the tests carried out on PANN and classical neural networks.
GitHub
As you know, neural network developers use various databases to test their networks. The instructions and data for testing Omega Server with the IRIS and MNIST databases are available on GitHub.
Geoffrey Hinton has figuratively described MNIST as the ‘Drosophila of machine learning,’ meaning that it allows machine learning researchers to study their algorithms under controlled laboratory conditions, similar to how biologists often study fruit flies.
IRIS: https://github.com/Omega-Server/p-net-comparison-iris
MNIST: https://github.com/Omega-Server/p-net-comparison
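As a starting point for comparison, the repositories above can be paired with a classical baseline on the same data. The following is a minimal sketch assuming only scikit-learn; the PANN interface itself is documented in the repositories and is not reproduced here, and the 10-unit hidden layer and 70/30 split are illustrative choices.

```python
# A minimal sketch (not the PANN/Omega Server API): train a classical
# network baseline on IRIS against which results from the
# p-net-comparison-iris repository can be compared.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)  # 150 samples, 4 features, 3 classes
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Hidden-layer size and iteration cap are illustrative assumptions.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)

print(f"IRIS test accuracy: {clf.score(X_test, y_test):.3f}")
```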