Practice More Questions From: Week 3 Quiz
Q:
Why does sequence make a large difference when determining the semantics of language?
Q:
How do Recurrent Neural Networks help you understand the impact of sequence on meaning?
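A minimal sketch of the idea, with assumed vocabulary and layer sizes: an RNN reads tokens in order, threading a hidden state through the sequence so each step's output depends on the words that came before it.

```python
import tensorflow as tf

# The hidden state carries context forward token by token; all sizes
# here (vocabulary, embedding dim, units) are illustrative assumptions.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(10000, 16),
    tf.keras.layers.SimpleRNN(32),   # state threads sequence context through time
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
print(model(tf.zeros((1, 120), dtype=tf.int32)).shape)  # (1, 1)
```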
Q:
How does an LSTM help understand meaning when words that qualify each other aren’t necessarily beside each other in a sentence?
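A hedged sketch along the same lines: the LSTM's gated cell state can carry a signal across many intervening tokens, so a word early in a sentence can still qualify one much later. Sizes are again assumptions.

```python
import tensorflow as tf

# The LSTM cell state is gated rather than overwritten at each step,
# which lets long-range context survive; sizes are illustrative.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(10000, 16),
    tf.keras.layers.LSTM(64),   # cell state preserves distant qualifiers
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
print(model(tf.zeros((1, 120), dtype=tf.int32)).shape)  # (1, 1)
```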
Q:
What Keras layer type allows LSTMs to look forward and backward in a sentence?
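The layer in question is the tf.keras.layers.Bidirectional wrapper; a minimal sketch with assumed vocabulary and embedding sizes:

```python
import tensorflow as tf

# Bidirectional runs one copy of the wrapped LSTM forward over the
# sequence and a second copy backward, merging the two passes
# (concatenation by default).
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(10000, 16),   # assumed sizes
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
```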
Q:
What’s the output shape of a bidirectional LSTM layer with 64 units?
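A quick shape check: with the default concatenation merge, the forward and backward passes each emit 64 features, so the last dimension is 64 x 2 = 128. The input shape below is an assumption.

```python
import tensorflow as tf

layer = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64))
x = tf.zeros((1, 120, 16))   # (batch, timesteps, features), assumed
print(layer(x).shape)        # (1, 128): 64 forward + 64 backward, concatenated
```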
Q:
If a sentence has 120 tokens in it, and a Conv1D with 128 filters and a kernel size of 5 is passed over it, what’s the output shape?
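Worked out: Keras's Conv1D defaults to 'valid' padding and stride 1, so a size-5 kernel fits at 120 - 5 + 1 = 116 positions, with one output channel per filter. The embedding dimension below is an assumption.

```python
import tensorflow as tf

conv = tf.keras.layers.Conv1D(filters=128, kernel_size=5, activation='relu')
x = tf.zeros((1, 120, 16))   # 120 tokens, assumed 16-dim embeddings
print(conv(x).shape)         # (1, 116, 128)
```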
Q:
When stacking LSTMs, how do you instruct an LSTM to feed the next one in the sequence?
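A minimal sketch, with assumed sizes: setting return_sequences=True on the earlier LSTM makes it emit output at every timestep, so the next LSTM receives a sequence rather than only the final state.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(10000, 16),             # assumed sizes
    tf.keras.layers.LSTM(64, return_sequences=True),  # pass every timestep onward
    tf.keras.layers.LSTM(32),                         # consumes the sequence, returns last output
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
```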
Q:
What’s the best way to avoid overfitting in NLP datasets?
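Not an answer key, just an illustration: dropout is one widely used mitigation for overfitting, and this sketch (rates and sizes are assumptions) shows where it commonly sits in an NLP model.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(10000, 16),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, dropout=0.2)),  # dropout on LSTM inputs
    tf.keras.layers.Dropout(0.3),   # drops activations before the classifier head
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
```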