Language models before Transformers: LM types

Match the language model types with their corresponding descriptions:
Unigram language model
N-gram language model
Exponential language model
Feed-forward neural probabilistic model
Recurrent neural network (RNN)
Uses the principle of maximum entropy, with feature functions of the word history, to determine conditional probabilities.
Learns the parameters of the conditional probability distribution of the next word using a feed-forward neural network.
Approximates the probability of observing a sentence by considering the conditional probabilities of each token given the n-1 previous tokens.
Maintains a hidden state that is updated token by token, which lets it capture dependencies over longer sequences.
Generates sentences token by token, choosing each token according to its individual frequency in the text, independently of the surrounding context.
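For concreteness, here is a minimal sketch of the n-gram idea from the descriptions above: a bigram (n = 2) model that estimates each token's conditional probability from counts in a toy corpus and scores a sentence as a product of those probabilities. The corpus and the helper names (`bigram_prob`, `sentence_prob`) are illustrative assumptions, not part of the exercise.

```python
from collections import Counter

# Toy corpus; purely illustrative (an assumption, not part of the exercise).
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count unigrams and adjacent token pairs (bigrams).
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_prob(prev: str, word: str) -> float:
    """Estimate P(word | prev) from counts: count(prev, word) / count(prev)."""
    return bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0

def sentence_prob(tokens: list[str]) -> float:
    """Approximate P(sentence) as the product of bigram conditional probabilities."""
    prob = 1.0
    for prev, word in zip(tokens, tokens[1:]):
        prob *= bigram_prob(prev, word)
    return prob

print(sentence_prob("the cat sat on the rug".split()))  # 0.0625 on this toy corpus
```

A unigram model is the n = 1 special case of the same scheme: each token's probability depends only on its own frequency, with no conditioning on previous tokens.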