Course name: Natural Language Processing
Course type: departmental elective
Instructor: 陳信希
College: College of Electrical Engineering and Computer Science
Department: Computer Science
Exam date (Y/M/D): 2012/11/15
Time limit (minutes): 170
Reward requested: yes, thank you
(If not stated explicitly, no reward will be given.)
Questions:
(1) Ambiguity resolution is a well-known problem in natural language processing.
Please give an example of ambiguity at the lexical, syntactic, and semantic
levels, respectively. (10 points)
(2) NLP can be regarded as notation transformation. Please use the following
sentence to describe its interpretation at the POS, syntax, and entity
extraction layers. (10 points)
Sheikh Mohammed announced at the ceremony "we want to make Dubai a new
trading center."
(3) Please describe how to model machine translation, part of speech tagging,
and speech recognition as decoding problems. (10 points)
(4)(a) What is a collocation? (5 points)
(b) Please describe how the log likelihood ratio is used to extract new terms.
(7 points)
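As background for (b): Dunning's log likelihood ratio compares two binomial hypotheses, namely that the second word's occurrence is independent of the first versus dependent on it; a high score marks the bigram as a likely collocation or new term. A minimal sketch (the counts at the bottom are illustrative placeholders, not from any particular corpus):

```python
import math

def log_l(k, n, x):
    # Binomial log-likelihood of k successes in n trials with probability x.
    # Clamp x away from 0 and 1 to avoid log(0).
    x = min(max(x, 1e-12), 1 - 1e-12)
    return k * math.log(x) + (n - k) * math.log(1 - x)

def llr(c1, c2, c12, n):
    """Dunning's log likelihood ratio for a bigram (w1, w2).

    c1, c2: unigram counts of w1 and w2; c12: count of the bigram;
    n: total number of tokens in the corpus.
    """
    p = c2 / n                     # H0: P(w2 | w1) = P(w2 | not w1) = p
    p1 = c12 / c1                  # H1: P(w2 | w1)
    p2 = (c2 - c12) / (n - c1)     # H1: P(w2 | not w1)
    return 2 * (log_l(c12, c1, p1) + log_l(c2 - c12, n - c1, p2)
                - log_l(c12, c1, p) - log_l(c2 - c12, n - c1, p))

# Illustrative counts: w1 occurs 15828 times, w2 4675 times, and the
# bigram 8 times in a corpus of about 14.3M tokens.
score = llr(15828, 4675, 8, 14307668)
```

Bigrams are then ranked by `score`; under independence the statistic is approximately chi-squared distributed, so a threshold can be chosen from that distribution.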
(5)(a) What is an n-gram model? (5 points)
(b) An archaeologist claimed he found a new work by Shakespeare. Please
propose a method to tell how probable it is that the work was written by
Shakespeare. (7 points)
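One such method for (b): train an n-gram model on Shakespeare's known works and measure the perplexity it assigns to the disputed text; lower perplexity than under models trained on his contemporaries is evidence for Shakespearean authorship. A minimal sketch with toy corpora standing in for the real texts:

```python
import math
from collections import Counter

def bigram_model(tokens, alpha=1.0):
    """Train an add-alpha smoothed word bigram model; return a log-prob function."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    vocab = len(unigrams)
    def logprob(w1, w2):
        return math.log((bigrams[(w1, w2)] + alpha) /
                        (unigrams[w1] + alpha * vocab))
    return logprob

def perplexity(logprob, tokens):
    # Per-bigram perplexity: exp of the negative average log-probability.
    n = len(tokens) - 1
    ll = sum(logprob(w1, w2) for w1, w2 in zip(tokens, tokens[1:]))
    return math.exp(-ll / n)

# Toy stand-ins; real use would train on Shakespeare's attested works.
shakespeare = "to be or not to be that is the question".split()
model = bigram_model(shakespeare)
disputed = "to be or not to be".split()
ppl = perplexity(model, disputed)
```

In practice one would also compare against models of other candidate authors, since a low perplexity alone only says the text resembles the training corpus.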
(6) Imagine you are fishing. There are only 8 species (carp, perch, whitefish,
trout, salmon, eel, catfish, bass) in the world. So far you have caught
10 carp, 3 perch, 2 whitefish, 1 trout, 1 salmon, and 1 eel. Please use
the following methods to answer two questions: (i) how likely is it that
the next fish caught is a whitefish? (ii) how likely is it that the next
fish caught is a member of a previously unseen species? (18 points)
(a) Maximum Likelihood Estimation, (b) Laplace, (c) Good-Turing
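For reference, the three estimators can be computed directly from the counts given in the question. A minimal sketch (it uses the raw count-of-counts for Good-Turing, which works here because N2 and N3 are both nonzero):

```python
from collections import Counter

# Observed catch (species -> count); 8 species exist in total.
catch = Counter({"carp": 10, "perch": 3, "whitefish": 2,
                 "trout": 1, "salmon": 1, "eel": 1})
N = sum(catch.values())   # 18 fish caught
V = 8                     # species in the world
seen = len(catch)         # 6 species seen so far

# (a) Maximum likelihood estimation
mle_whitefish = catch["whitefish"] / N   # 2/18
mle_new = 0.0                            # MLE gives unseen species zero mass

# (b) Laplace (add-one) smoothing over all 8 species
lap_whitefish = (catch["whitefish"] + 1) / (N + V)   # 3/26
lap_new = (V - seen) / (N + V)                       # 2 unseen species: 2/26

# (c) Good-Turing: P(new) = N1/N and c* = (c+1) * N_{c+1} / N_c
Nc = Counter(catch.values())     # count-of-counts: N1=3, N2=1, N3=1
gt_new = Nc[1] / N                                   # 3/18
c_star = (catch["whitefish"] + 1) * Nc[3] / Nc[2]    # adjusted count for c=2
gt_whitefish = c_star / N
```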
(7) In a Hidden Markov Model (HMM), the forward probability and the backward
probability are defined respectively as follows.
αt(j) = P(o1, o2, ..., ot, qt = j | λ)
βt(i) = P(ot+1, ot+2, ..., oT | qt = i, λ)
(a) Please first formulate αt(j) in terms of the previous forward
probability, the transition probability, and the state observation
likelihood, and then discuss how it can be used to deal with the combinatorial
explosion problem in computing the probability of an observation.
(10 points)
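For reference, the standard forward recursion asked for in (a) is:

```latex
\alpha_t(j) = \Big[\sum_{i=1}^{N} \alpha_{t-1}(i)\,a_{ij}\Big]\, b_j(o_t)
```

where $a_{ij}$ is the transition probability and $b_j(o_t)$ the state observation likelihood. Because each $\alpha_t(j)$ folds together every path reaching state $j$ at time $t$, the observation probability $P(O \mid \lambda) = \sum_j \alpha_T(j)$ is computed in $O(N^2 T)$ time instead of enumerating all $O(N^T)$ state sequences.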
(b) Please formulate βt(i) in terms of the next backward probability, the
transition probability, and the state observation likelihood. (6 points)
(c) Please describe how to use the forward probability and backward probability
to compute the transition probability in HMM training. (6 points)
(d) Given the following HMM, please determine the best state (HOT/COLD)
sequence for the observation sequence 2 3 1 without enumerating all the paths.
(6 points)
[A figure accompanies this question; to be added later.]
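Since the figure is not reproduced here, (d) cannot be answered numerically, but the technique it calls for is the Viterbi algorithm. A minimal sketch with placeholder HMM parameters (NOT the exam's figure, so the decoded path below is illustrative only):

```python
import math

states = ["HOT", "COLD"]
# Hypothetical parameters standing in for the missing figure:
start = {"HOT": 0.8, "COLD": 0.2}
trans = {"HOT": {"HOT": 0.6, "COLD": 0.4},
         "COLD": {"HOT": 0.5, "COLD": 0.5}}
emit = {"HOT": {1: 0.2, 2: 0.4, 3: 0.4},
        "COLD": {1: 0.5, 2: 0.4, 3: 0.1}}

def viterbi(obs):
    # v[s] = log-probability of the best path ending in state s.
    v = {s: math.log(start[s] * emit[s][obs[0]]) for s in states}
    back = []
    for o in obs[1:]:
        prev, v, ptr = v, {}, {}
        for s in states:
            best = max(states, key=lambda r: prev[r] + math.log(trans[r][s]))
            v[s] = prev[best] + math.log(trans[best][s] * emit[s][o])
            ptr[s] = best
        back.append(ptr)
    # Follow back-pointers from the best final state.
    last = max(states, key=lambda s: v[s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

best_path = viterbi([2, 3, 1])
```

Unlike the forward algorithm, each step keeps only the single best predecessor per state (max instead of sum), which is why no path enumeration is needed.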
(8) Voice QA systems such as Apple Siri, Google Now, etc. are some recent
applications of natural language processing. Please describe three fundamental
NLP functions behind these services. (bonus, 10 points)
--
※ Posted from: PTT (ptt.cc)
◆ From: 123.193.6.232