Learning Deep Learning: Theory and Practice of Neural Networks, Computer Vision, NLP, and Transformers using TensorFlow by Magnus Ekman

Author: Magnus Ekman
Language: eng
Format: epub
Publisher: Addison-Wesley Professional
Published: 2021-08-08T16:00:00+00:00


Skip-Gram Model

A skip-gram model is an extension of the n-gram model in which the words do not all need to appear consecutively in the training corpus. Instead, some words can be skipped. A k-skip-n-gram model is defined by the two parameters k and n, where k determines how many words can be skipped and n determines how many words each skip-gram contains. For instance, a 1-skip-2-gram model will contain all the bigrams (2-grams) that we discussed previously but will also contain nonconsecutive word pairs that are separated by, at most, one word. If we again consider the word sequence “The more I read, . . . ,” then in addition to /the more/, /more i/, and so on, the 1-skip-2-gram model will also contain /the i/, /more read/, and so on.
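To make this concrete, the short Python snippet below enumerates the k-skip-n-grams of a tokenized sentence. It is a minimal sketch rather than code from the book, and the helper name skip_grams is chosen here purely for illustration.

from itertools import combinations

def skip_grams(tokens, n=2, k=1):
    # For each starting word, pick the remaining n-1 words from the
    # next n-1+k tokens, so at most k words are skipped in total.
    grams = []
    for i in range(len(tokens)):
        window = tokens[i+1 : i+n+k]
        for rest in combinations(window, n-1):
            grams.append((tokens[i],) + rest)
    return grams

sentence = 'the more i read the more i learn'.split()
print(skip_grams(sentence, n=2, k=1))
# The output starts with ('the', 'more'), ('the', 'i'), ('more', 'i'),
# ('more', 'read'), ... matching the pairs listed in the text.

Note that the enumeration also includes the ordinary contiguous bigrams, because skipping at most k words allows skipping zero words as well.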


