LM types

Match the language model types with their corresponding descriptions:

Match the items from left and right columns
Unigram language model
N-gram language model
Exponential language model
Feed-forward neural probabilistic model
Recurrent neural network (RNN)
Utilizes the principle of maximum entropy and arbitrary functions to determine conditional probabilities.
Learns the parameters of the conditional probability distribution of the next word using a feed-forward neural network.
Approximates the probability of observing a sentence by considering the conditional probabilities of each token given the n-1 previous tokens.
Maintains a hidden state that lets it capture dependencies over longer sequences.
Generates sentences token by token, considering the probabilities of individual tokens based on their occurrence in the text.
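To make the unigram and n-gram entries above concrete, here is a minimal count-based sketch: a unigram model scores each token by its own frequency, while a bigram (n = 2) model conditions each token on the previous one. The toy corpus and function names are illustrative, not part of the exercise.

```python
from collections import Counter

corpus = "the cat sat on the mat the cat ran".split()

# Unigram model: each token's probability depends only on its own frequency.
unigram_counts = Counter(corpus)
total = len(corpus)

def p_unigram(w):
    return unigram_counts[w] / total

# Bigram model: probability of a token conditioned on the previous token.
bigram_counts = Counter(zip(corpus, corpus[1:]))

def p_bigram(w, prev):
    return bigram_counts[(prev, w)] / unigram_counts[prev]

def sentence_prob_unigram(tokens):
    # Product of independent token probabilities.
    p = 1.0
    for w in tokens:
        p *= p_unigram(w)
    return p

def sentence_prob_bigram(tokens):
    # First token scored as a unigram, the rest conditioned on their predecessor.
    p = p_unigram(tokens[0])
    for prev, w in zip(tokens, tokens[1:]):
        p *= p_bigram(w, prev)
    return p
```

In this corpus "the" appears 3 times out of 9 tokens, so `p_unigram("the")` is 1/3, and "the" is followed by "cat" in 2 of its 3 occurrences, so `p_bigram("cat", "the")` is 2/3; the bigram model captures local word order that the unigram model ignores.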