Language model

SCRN (Structurally Constrained Recurrent Network)

Facebook AI Research
Language modeling

Developed by Facebook AI Research, SCRN addresses the classic vanishing-gradient problem in recurrent networks. By efficiently capturing long-range patterns in text, it makes neural language modeling more accurate and stable.

A recurrent neural network is a powerful model that learns temporal patterns in sequential data. For a long time, it was believed that recurrent networks are difficult to train using simple optimizers, such as stochastic gradient descent, due to the so-called vanishing gradient problem. In this paper, we show that learning longer-term patterns in real data, such as natural language, is perfectly possible using gradient descent. This is achieved by a slight structural modification of the simple recurrent neural network architecture. We encourage some of the hidden units to change their state slowly by making part of the recurrent weight matrix close to identity, thus forming a kind of longer-term memory. We evaluate our model in language modeling experiments, where we obtain performance similar to the much more complex Long Short-Term Memory (LSTM) networks (Hochreiter & Schmidhuber, 1997).
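The structural constraint described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the variable names, dimensions, and the scalar `alpha` hyperparameter are assumptions. The key idea is that the slow "context" state uses a recurrent matrix fixed to `alpha * I` (close to identity), so gradients through it decay slowly and long-range information is preserved:

```python
import numpy as np

def scrn_step(x, h_prev, s_prev, params, alpha=0.95):
    """One time step of a Structurally Constrained Recurrent Network (sketch).

    x      : input vector (e.g. a one-hot word), shape (d_in,)
    h_prev : fast hidden state, shape (d_h,)
    s_prev : slow context state, shape (d_s,)
    params : dict of weight matrices (names are illustrative, not from the paper's code)
    """
    B, P, A, R, U, V = (params[k] for k in ("B", "P", "A", "R", "U", "V"))
    # Slow context state: its recurrent weights are constrained to alpha * identity,
    # so it changes slowly and acts as a longer-term memory.
    s = (1.0 - alpha) * (B @ x) + alpha * s_prev
    # Fast hidden state: a standard recurrent update, also fed by the context state.
    h = 1.0 / (1.0 + np.exp(-(P @ s + A @ x + R @ h_prev)))
    # Output combines both states; softmax yields next-token probabilities.
    logits = U @ h + V @ s
    e = np.exp(logits - logits.max())
    return h, s, e / e.sum()
```

At the sequence level one simply loops this step over tokens; the only difference from a plain recurrent network is the extra state `s`, whose near-identity recurrence gives the network its longer memory without LSTM-style gating.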
