
Knowledge Transfer Between Artificial Intelligence Systems.

Ivan Y Tyukin; Alexander N Gorban; Konstantin I Sofeykov; Ilya Romanenko

Abstract

We consider the fundamental question of how a legacy "student" Artificial Intelligence (AI) system could learn from a legacy "teacher" AI system or a human expert without re-training and, most importantly, without requiring significant computational resources. Here "learning" is broadly understood as the ability of one system to mimic the responses of the other to incoming stimulation, and vice versa. We call such learning Artificial Intelligence knowledge transfer. We show that if the internal variables of the "student" AI system have the structure of an n-dimensional topological vector space and n is sufficiently high, then, with probability close to one, the required knowledge transfer can be implemented by simple cascades of linear functionals. In particular, for n sufficiently large, with probability close to one, the "student" system can successfully and non-iteratively learn k ≪ n new examples from the "teacher" (or correct the same number of mistakes) at the cost of two additional inner products. The concept is illustrated with an example of knowledge transfer from one pre-trained convolutional neural network to another.
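The abstract's central claim, that in high dimension a single new example (or mistake) can be separated from a large legacy sample by one linear functional with probability close to one, can be illustrated with a minimal numerical sketch. The dimension, sample size, seed, and threshold below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, M = 200, 10000  # dimension of internal representation, number of legacy samples

# Legacy "student" representations: M i.i.d. points on the unit sphere in R^n
X = rng.standard_normal((M, n))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# A new example to learn (e.g., a mistake flagged by the "teacher")
x0 = rng.standard_normal(n)
x0 /= np.linalg.norm(x0)

# One linear functional l(z) = <z, x0>; l(x0) = 1, so threshold at 1 - eps
eps = 0.5
scores = X @ x0                    # one inner product per legacy sample
separated = np.all(scores < 1 - eps)
```

For independent points on the unit sphere, inner products concentrate near zero with fluctuations of order 1/√n, so for n = 200 the threshold 0.5 excludes all legacy samples with overwhelming probability and `separated` is True; this concentration-of-measure effect is the essence of the stochastic separation theorems in the reference list.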


Keywords:  concentration of measure; error correction; knowledge transfer in artificial intelligence systems; neural networks; stochastic separation theorems; supervised learning

Year:  2018        PMID: 30150929      PMCID: PMC6099325          DOI: 10.3389/fnbot.2018.00049

Source DB:  PubMed          Journal:  Front Neurorobot        ISSN: 1662-5218            Impact factor:   2.650


References: 5 in total

1.  Training a support vector machine in the primal.

Authors:  Olivier Chapelle
Journal:  Neural Comput       Date:  2007-05       Impact factor: 2.026

2.  [Small experts and internal conflicts in learning neural networks].

Authors:  S E Gilev; A N Gorban'; E M Mirkes
Journal:  Dokl Akad Nauk SSSR       Date:  1991

3.  Stochastic separation theorems.

Authors:  A N Gorban; I Y Tyukin
Journal:  Neural Netw       Date:  2017-07-31

Review 4.  Blessing of dimensionality: mathematical foundations of the statistical physics of data.

Authors:  A N Gorban; I Y Tyukin
Journal:  Philos Trans A Math Phys Eng Sci       Date:  2018-04-28       Impact factor: 4.226

5.  Rapid Encoding of New Memories by Individual Neurons in the Human Brain.

Authors:  Matias J Ison; Rodrigo Quian Quiroga; Itzhak Fried
Journal:  Neuron       Date:  2015-07-01       Impact factor: 17.173

Cited by: 1 in total

Review 1.  High-Dimensional Brain in a High-Dimensional World: Blessing of Dimensionality.

Authors:  Alexander N Gorban; Valery A Makarov; Ivan Y Tyukin
Journal:  Entropy (Basel)       Date:  2020-01-09       Impact factor: 2.524

