Week 4 Quiz
Graded Quiz • 30 min • 5 Questions • Week 4

Q:

When predicting words to generate poetry, the more words you predict, the more likely the output is to end up as gibberish. Why?

Q:

What is a major drawback of word-based training for text generation compared to character-based generation?

Q:

In natural language processing, predicting the next item in a sequence is a classification problem. Therefore, after creating inputs and labels from the subphrases, we one-hot encode the labels. What function do we use to create one-hot encoded arrays of the labels?
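For context on what this question is asking: in Keras this step is typically done with `tf.keras.utils.to_categorical`. A minimal numpy sketch of the same operation (the function name `one_hot` and the toy labels are illustrative, not from the quiz):

```python
import numpy as np

def one_hot(labels, num_classes):
    """Minimal numpy equivalent of tf.keras.utils.to_categorical:
    turn integer word indices into one-hot rows."""
    encoded = np.zeros((len(labels), num_classes))
    encoded[np.arange(len(labels)), labels] = 1.0
    return encoded

# Word-index labels for a toy vocabulary of 5 words.
labels = [2, 0, 4]
print(one_hot(labels, num_classes=5))
```

Each row has a single 1 in the column of the correct next word, which is the target format expected by a categorical cross-entropy loss.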

Q:

What are the critical steps in preparing the input sequences for the prediction model?
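The steps this question refers to can be sketched end to end: tokenize the corpus, build n-gram subphrases from each line, pre-pad them to a uniform length, then split each padded sequence into an input (all tokens but the last) and a label (the last token). A minimal sketch with a toy one-line corpus (the example text and variable names are assumptions, not from the quiz):

```python
import numpy as np

corpus = ["in the town of athy one jeremy lanigan"]  # toy corpus line

# 1. Tokenize: map each word to an integer index (0 is reserved for padding).
words = sorted({w for line in corpus for w in line.split()})
word_index = {w: i + 1 for i, w in enumerate(words)}

# 2. Build n-gram subphrases: every prefix of length >= 2 from each line.
sequences = []
for line in corpus:
    ids = [word_index[w] for w in line.split()]
    for i in range(2, len(ids) + 1):
        sequences.append(ids[:i])

# 3. Pre-pad all subphrases to the length of the longest one.
max_len = max(len(s) for s in sequences)
padded = np.array([[0] * (max_len - len(s)) + s for s in sequences])

# 4. Split: inputs are all tokens but the last; the label is the last token.
xs, labels = padded[:, :-1], padded[:, -1]
print(xs.shape, labels.shape)
```

In the course's Keras setting, steps 1 and 3 would typically use `Tokenizer` and `pad_sequences`; the prefix-and-split logic is the same.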

Q:

True or False: When building the model, we use a sigmoid-activated Dense output layer with one neuron per word that lights up when we predict a given word.
