MCQs on Natural Language Processing with Answers (PDF)

Natural language processing (NLP) is a subfield of linguistics, computer science, and artificial intelligence. NLP is concerned with the interactions between computers and human natural language, such as speech and text.

Machine learning techniques are used for processing natural language.

This post presents 30 MCQs on Natural Language Processing (NLP).

Multiple Choice Questions on Natural Language Processing

1. Which of the following are document preprocessing steps?

 a. Tokenization

 b. Substitution

 c. Normalization

 d. Feature Selection

2. Pick the stemming actions

 a. was, am, are, is → be

 b. helped, helps → help

 c. troubled, troubling, trouble → trouble

 d. friend, friendship, friends, friendships → friend

 e. studied → studi

 f. All of the above
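The suffix-stripping behaviour in the options above (e.g. "studied → studi") can be sketched with a few hand-written rules. This is a minimal illustrative sketch in Python, not the real Porter stemmer; the suffix list is an assumption for the example:

```python
# Minimal rule-based stemmer sketch (NOT the full Porter algorithm):
# strip the first matching suffix, longest suffixes first.
SUFFIXES = ["ships", "ship", "ing", "ed", "s"]

def stem(word):
    for suffix in SUFFIXES:
        # Only strip if a reasonably long stem remains.
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

print(stem("helped"))       # help
print(stem("studied"))      # studi  (crude truncation, as in option e)
print(stem("friendships"))  # friend
```

Note how the stemmer produces "studi" rather than a dictionary word: stemming is crude truncation, which is exactly what distinguishes it from the lemmatization examples in the next question.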

3. Pick the lemmatization actions

 a. was, am, are, is → be

 b. study, studying, studied → study

 c. troubled, troubling, trouble → trouble

 d. has, have, had → have

 e. All of the above

4. Case folding is used for ___________________

 a. Normalization

 b. Tokenization

 c. Stemming

 d. Lemmatization


5. The start of the sentence is matched by ____________________ and the end of the sentence is matched by ________________________

 a. ^ and $

 b. $ and ^

 c. \$ and .

 d. \^ and \.
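In regular expressions, `^` anchors a pattern at the start of the string and `$` anchors it at the end. A quick demonstration with Python's `re` module:

```python
import re

sentence = "The cat sat."

# ^ matches only at the start of the string.
print(bool(re.search(r"^The", sentence)))  # True
print(bool(re.search(r"^cat", sentence)))  # False

# $ matches only at the end; the dot must be escaped to match literally.
print(bool(re.search(r"\.$", sentence)))   # True
```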

6. Raw co-occurrence matrices are usually sparse

 a. True

 b. False

7. Is a one-hot vector a dense vector?

 a. True

 b. False
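A one-hot vector has exactly one non-zero entry, so for any realistic vocabulary size it is extremely sparse. A small illustration in plain Python:

```python
def one_hot(index, vocab_size):
    """Return a one-hot vector: all zeros except a single 1."""
    vec = [0] * vocab_size
    vec[index] = 1
    return vec

v = one_hot(3, 10)
print(v)                        # [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
print(sum(x != 0 for x in v))   # 1 non-zero entry out of 10
```

For a real vocabulary of, say, 100,000 words, 99,999 of the 100,000 entries are zero, which is the opposite of a dense representation such as a learned word embedding.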

8. In Naive Bayes classification, the posterior probability is estimated for predicting the class

 a. True

 b. False
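Naive Bayes predicts the class that maximizes the posterior P(class | evidence) ∝ P(evidence | class) · P(class). A toy sketch of that computation, where the priors and likelihoods are made-up illustrative numbers, not from any dataset:

```python
# Toy Bayes-rule posterior for one observed word, e.g. "offer".
# All probabilities below are illustrative assumptions.
priors = {"spam": 0.4, "ham": 0.6}           # P(class)
likelihood = {"spam": 0.05, "ham": 0.001}    # P(word | class)

unnorm = {c: likelihood[c] * priors[c] for c in priors}
evidence = sum(unnorm.values())              # P(word)
posterior = {c: unnorm[c] / evidence for c in priors}

prediction = max(posterior, key=posterior.get)
print(prediction)  # spam
```

The normalizing evidence term is the same for every class, so in practice the argmax over the unnormalized products already determines the prediction.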

9. Which one of the following is linearly inseparable?

 a. NAND

 b. NOR

 c. Complement of XOR

 d. None of the above

10. A sigmoid function squashes the values into the range [0, 1]

 a. True

 b. False

11. A perceptron can classify data with more than 2 classes

 a. True

 b. False

12. What is the height of the balanced binary tree if the vocabulary size is 128?

 a. 64

 b. 32

 c. 7

 d. None of the above
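With the vocabulary words at the leaves of a balanced binary tree, the height is log2(V); for V = 128 that gives 7, which is why hierarchical Softmax needs only about log2(V) binary decisions per word instead of a full V-way Softmax:

```python
import math

vocab_size = 128
# Height of a balanced binary tree with vocab_size leaves.
height = int(math.log2(vocab_size))
print(height)  # 7
```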

13. The statement “Hierarchical Softmax does not provide a unique path to each word in the vocabulary” is

 a. True

 b. False

14. In the hierarchical Softmax model, the ANN learns the probabilistic decisions at every node. Is the statement true?

 a. True

 b. False

15. The most frequent words are kept at a shorter distance from the root by ________________

 a. Transform coding

 b. Huffman encoding

 c. Run-length encoding

 d. None of the above

16. Is the statement "Hierarchical Softmax increases the computation complexity when compared to Softmax" true?

 a. True

 b. False

17. The RNN imposes a constraint on the length of the input word sequence. Is the above statement correct?

 a. Yes

 b. No

18. What is the most important architectural change introduced in the recurrent neural network?

 a. Multiple hidden layers

 b. Word embedding as input

 c. Hierarchical Softmax layer

 d. State vector

 e. None of the above

19. LSTM introduces a _________________ at the hidden layer

 a. State vector

 b. Memory vector

 c. Word Vector

 d. None of the above


20. In phrase-based translation, only linguistic phrases are considered

 a. True

 b. False


21. In phrase-based translation, the whole sentence could be considered as a phrase

 a. True

 b. False

22. The process of symmetrization made "many-to-one" and "one-to-many" alignments possible

 a. True

 b. False

23. Phrase-based translation does not use the Noisy-Channel model

 a. True

 b. False

24. In the encoder, the fixed-length vector at the output represents ________________

 a. Word vector

 b. Encoding of the input sentence

 c. Translation of the sentence

 d. Softmax containing the probability score of all words in the vocabulary

25. The encoder and decoder are trained independently

 a. True

 b. False

26. In the global attention model, the decoder selectively learns to align words

 a. True

 b. False

27. In the attention model (global and local), the attention

 a. True

 b. False

28. Sequence-to-sequence translation represents a conditional language model

 a. True

 b. False

29. Beam search is a greedy search algorithm

 a. True

 b. False

30. Beam search guarantees the best translation for SMT and NMT

 a. True

 b. False
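Beam search extends a small set of best partial hypotheses step by step and greedily prunes everything else, so it can miss the globally best sequence. A toy sketch over a hand-made next-token log-probability table (the probabilities are illustrative assumptions, not from a trained model):

```python
import math

# Toy next-token log-probabilities, conditioned on the previous token.
LOGPROB = {
    "<s>": {"the": math.log(0.6), "a": math.log(0.4)},
    "the": {"cat": math.log(0.5), "dog": math.log(0.5)},
    "a":   {"cat": math.log(0.9), "dog": math.log(0.1)},
    "cat": {"</s>": 0.0},
    "dog": {"</s>": 0.0},
}

def beam_search(beam_width=2, max_len=5):
    beams = [(0.0, ["<s>"])]
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq[-1] == "</s>":           # finished hypothesis: keep as-is
                candidates.append((score, seq))
                continue
            for tok, lp in LOGPROB[seq[-1]].items():
                candidates.append((score + lp, seq + [tok]))
        # Greedy pruning step: keep only the top-k hypotheses.
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
    return beams[0]

score, seq = beam_search(beam_width=2)
print(seq)  # ['<s>', 'a', 'cat', '</s>']
```

With `beam_width=1` this degenerates to pure greedy decoding: it commits to "the" (0.6) at the first step and ends with probability 0.3, missing the higher-probability "a cat" sequence (0.4 × 0.9 = 0.36). Any finite beam width can prune away the optimum in the same way, which is exactly why beam search gives no optimality guarantee for SMT or NMT.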
