Systemic shortcomings of classical AI

Inherent defect of neural networks

The development paradigm of today's neural networks was flawed from the outset. This innate error is the main reason for the slowdown in artificial intelligence and for its extensive, brute-force mode of growth. We call it the Rosenblatt Fallacy: the erroneous postulate that the design of a formal neuron is limited to a single weight per synapse, which matched biologists' notions that have long since become outdated.
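To make the postulate concrete, here is a minimal illustrative sketch (not taken from the source) of a classic formal neuron in the Rosenblatt tradition. The function name and values are hypothetical; the point is that each input synapse is modelled by exactly one scalar weight, which is the design choice the text calls the Rosenblatt Fallacy.

```python
def formal_neuron(inputs, weights, bias):
    """Classic formal neuron: weighted sum of inputs, then a hard threshold.

    Each synapse (input) is represented by a single scalar weight --
    the assumption the text identifies as the Rosenblatt Fallacy.
    """
    s = bias + sum(x * w for x, w in zip(inputs, weights))
    return 1 if s >= 0 else 0

# One weight per synapse: three inputs -> exactly three weights.
print(formal_neuron([1, 0, 1], [0.5, -0.3, 0.2], -0.6))  # weighted sum 0.1 >= 0, so prints 1
```

All of the richness of a biological synapse is collapsed here into one number per connection, which is exactly the simplification the text argues against.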

Intrinsic limitations

This error in the original architecture of neural networks gives rise to inevitable limitations that are becoming ever more tangible and ever more critical. As data volumes and task complexity grow, the processing power required by networks built on ordinary neurons grows exponentially, and with it training time, energy consumption, staffing costs, and other expenses. The cost of training neural networks is becoming prohibitive.

Consequences

This inherent defect and these intrinsic limitations have slowed progress in the field of artificial intelligence, for several reasons:

  1. Access monopolization:
    1. Requires powerful computing hardware and expensive infrastructure.
    2. Accessible only to large companies and giant monopolies.
    3. Chronic shortage of computing power with exponentially growing needs.
    4. Processing data at third-party facilities risks breaching data confidentiality.
  2. High costs:
    1. Enormous energy and resource intensity, with massive emissions.
    2. Difficulty in integration, use and support.
    3. Vulnerability and low reliability: failures, errors, external influences.
    4. Process opacity: neural network training occurs through trial and error.
  3. Poor performance:
    1. Low training speed; efforts to accelerate training progress very slowly.
    2. The need for large amounts of data and training examples.
    3. The effect of catastrophic ‘forgetting’ and ‘freezing’ as tasks become more complex.
    4. Impossibility of additional (incremental) training.