Which technique of word2vec uses surrounding words to predict a center word?


The Continuous Bag-of-Words (CBOW) technique of word2vec predicts a center word from its surrounding context words. The model takes the words adjacent to a target position in a sentence and uses them to predict the word at that position, effectively capturing the contextual relationships between words.

In practice, the model averages the embeddings of the context words and feeds that average through an output layer to score every word in the vocabulary, predicting the center word. This reflects how language works: the meaning of a word can often be inferred from the words around it.
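To make the averaging step concrete, here is a minimal sketch of a single CBOW forward pass in NumPy. Everything in it (the toy corpus, the embedding size of 8, the random initialization, the function names) is an illustrative assumption rather than a reference implementation, and no training loop is shown:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus and vocabulary (assumptions for illustration only).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word_to_id = {w: i for i, w in enumerate(vocab)}

embed_dim = 8            # arbitrary illustrative size
V = len(vocab)
W_in = rng.normal(scale=0.1, size=(V, embed_dim))   # input (context) embeddings
W_out = rng.normal(scale=0.1, size=(embed_dim, V))  # output (prediction) weights

def cbow_forward(context_words, center_word):
    """Average the context embeddings, then score every vocabulary word."""
    ctx_ids = [word_to_id[w] for w in context_words]
    h = W_in[ctx_ids].mean(axis=0)          # average of the context vectors
    scores = h @ W_out                      # one score per vocabulary word
    probs = np.exp(scores - scores.max())   # numerically stable softmax
    probs /= probs.sum()
    return probs[word_to_id[center_word]]   # P(center | context)

# Window of 2 words on each side around "fox".
p = cbow_forward(["quick", "brown", "jumps", "over"], "fox")
print(f"P('fox' | context) = {p:.4f}")      # near 1/V before any training
```

Training would then adjust `W_in` and `W_out` so that this probability rises for center words actually observed with that context.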

Skip-gram, the other word2vec technique, works in the opposite direction: it uses the center word to predict each of the surrounding context words (see the contrast sketch below). Of the remaining options, K-means is a clustering algorithm unrelated to word embeddings, and Hierarchical Softmax is an efficiency technique for computing output probabilities in neural networks, not a predictive architecture for words in context. CBOW is therefore the technique that predicts a center word from its surrounding words.
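To make the directional difference concrete, the short sketch below lists the training pairs each technique would extract from the same sentence; the sentence and the window size of 1 are arbitrary choices for illustration:

```python
sentence = "the quick brown fox".split()
window = 1  # words taken on each side of the center position

for i, center in enumerate(sentence):
    lo, hi = max(0, i - window), min(len(sentence), i + window + 1)
    context = [sentence[j] for j in range(lo, hi) if j != i]
    print(f"CBOW:      {context} -> {center!r}")    # context predicts center
    for c in context:
        print(f"skip-gram: {center!r} -> {c!r}")    # center predicts each context word
```

The same sliding window thus yields one CBOW example per position but several skip-gram examples, one per context word.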
