LSTM (Long Short-Term Memory)

LSTM is a type of recurrent neural network (RNN) designed to overcome the vanishing gradient
problem: gated memory cells let gradients flow across many time steps. It is commonly used for
sequence prediction tasks such as speech recognition, language modeling, and sentiment analysis.
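The gating mechanism can be sketched in a few lines of NumPy. The weight shapes, the gate ordering inside the stacked matrix, and the toy inputs below are illustrative choices, not a fixed convention:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. W: (4H, D), U: (4H, H), b: (4H,).
    Gates stacked in order: input, forget, cell candidate, output."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:H])            # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    g = np.tanh(z[2 * H:3 * H])   # candidate cell state
    o = sigmoid(z[3 * H:])        # output gate
    c = f * c_prev + i * g        # additive, gated cell update
    h = o * np.tanh(c)
    return h, c

# Run a toy 5-step sequence through the cell.
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.normal(scale=0.1, size=(4 * H, D))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):
    h, c = lstm_step(x, h, c, W, U, b)
```

The additive update of the cell state c (rather than repeated matrix multiplication of h) is what keeps gradients from vanishing over long sequences.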

GRU (Gated Recurrent Unit)

GRU is another type of recurrent neural network that addresses the vanishing gradient problem.
It simplifies the LSTM architecture by merging the forget and input gates into a single update
gate and by merging the cell state and hidden state.
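A minimal NumPy sketch of one GRU step, showing the update gate doing double duty. Weight names and the sign convention for z versus 1 - z vary between texts; this is one common form:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step with hidden state h_prev and input x."""
    z = sigmoid(Wz @ x + Uz @ h_prev)    # update gate: forget + input in one
    r = sigmoid(Wr @ x + Ur @ h_prev)    # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))  # candidate state
    return (1.0 - z) * h_prev + z * h_tilde  # interpolate old vs. new state

# Toy run over a 5-step sequence with random weights.
rng = np.random.default_rng(0)
D, H = 3, 4
Wz, Wr, Wh = (rng.normal(scale=0.1, size=(H, D)) for _ in range(3))
Uz, Ur, Uh = (rng.normal(scale=0.1, size=(H, H)) for _ in range(3))
h = np.zeros(H)
for x in rng.normal(size=(5, D)):
    h = gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh)
```

Because there is no separate cell state, a GRU layer has fewer parameters than an LSTM layer of the same width.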

MLP (Multilayer Perceptron)

MLP models, also known as feedforward neural networks, can be used for time series forecasting
by taking a fixed window of past observations as input and predicting one or more future values.
They are simpler than recurrent models but can still give accurate forecasts when the relevant
history fits inside the input window.
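Turning the series into supervised (window, next value) pairs is the step that makes this work. A small sketch, with the window length of 10 and the sine series chosen purely for illustration; any regressor, such as an MLP, could then be fit on X and y:

```python
import numpy as np

def make_windows(series, window):
    """Turn a 1-D series into (lag-window, next-value) training pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

series = np.sin(np.linspace(0, 6 * np.pi, 100))
X, y = make_windows(series, window=10)
# Each row X[i] holds 10 consecutive values; y[i] is the value that follows.
```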

Seq2Seq (Sequence-to-Sequence)

Seq2Seq models pair an encoder, which compresses the input sequence into a context
representation, with a decoder, which generates the output sequence from that context. They are
typically trained with teacher forcing: during training, the correct output value from the
previous step is fed as input to the decoder at each step, and the model is optimized to
minimize the difference between the predicted and target output sequences.
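Under teacher forcing, the decoder input is just the target sequence shifted right by one step behind a start token. A sketch, where the zero start token is an arbitrary choice:

```python
import numpy as np

def teacher_forcing_inputs(target, start_token=0.0):
    """Build decoder inputs by shifting the target right one step:
    at step t the decoder sees the *true* value from step t-1."""
    dec_in = np.empty_like(target)
    dec_in[0] = start_token
    dec_in[1:] = target[:-1]
    return dec_in

target = np.array([1.0, 2.0, 3.0, 4.0])
dec_in = teacher_forcing_inputs(target)
```

At inference time no target is available, so the decoder feeds its own previous prediction back in, which is why train and test behavior can diverge (exposure bias).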

Bayesian Recurrent Neural Network (BRNN)

BRNN is a recurrent neural network that applies Bayesian inference to its weights, so that
forecasts of the Euro price come with uncertainty estimates rather than a single point value.
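The uncertainty estimate typically comes from sampling weights from the (approximate) posterior and forecasting once per sample. The tiny scalar RNN, the Gaussian posterior, and all numbers below are illustrative assumptions, not the model described here:

```python
import numpy as np

rng = np.random.default_rng(1)

def rnn_forecast(x_seq, w, u):
    """Tiny scalar RNN: h_t = tanh(w*x_t + u*h_{t-1}); forecast = h_T."""
    h = 0.0
    for x in x_seq:
        h = np.tanh(w * x + u * h)
    return h

# Assume a Gaussian approximate posterior around a point estimate of the
# two weights; draw S samples and forecast with each one.
w_mean, u_mean, post_std = 0.8, 0.5, 0.1
x_seq = [1.0, 1.1, 1.2, 1.15]
samples = np.array([
    rnn_forecast(x_seq, rng.normal(w_mean, post_std), rng.normal(u_mean, post_std))
    for _ in range(500)
])
pred_mean, pred_std = samples.mean(), samples.std()  # forecast + uncertainty
```

The spread of the sampled forecasts (pred_std) is the uncertainty estimate; a wide spread signals that the posterior does not pin the prediction down.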

Neural Prophet

Neural Prophet is a deep learning model designed specifically for time series forecasting. It is
built on feed-forward neural networks and decomposes a series into additive components, such as
trend, seasonality, and auto-regressive effects, each capturing a different aspect of the data.
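The additive-decomposition idea can be illustrated without the library itself. In this sketch the series, the linear trend, and the 7-step cycle are all made up, and a plain least-squares line stands in for the learned trend component:

```python
import numpy as np

# Illustrative decomposition of the kind Neural Prophet fits:
# y(t) = trend(t) + seasonality(t) + other learned effects + noise.
t = np.arange(200, dtype=float)
trend = 0.05 * t                                # linear trend component
seasonality = 2.0 * np.sin(2 * np.pi * t / 7)   # weekly-style cycle
y = trend + seasonality

# Recover the trend with a least-squares line, then read the seasonal
# component off the residual (a stand-in for the fitted components).
slope, intercept = np.polyfit(t, y, 1)
trend_hat = slope * t + intercept
seasonal_hat = y - trend_hat
```

Fitting each component separately is what makes such forecasts interpretable: the trend and seasonal terms can be inspected on their own.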