In Keras, how is the hidden layer of a GRU model built?


In Keras, when building a GRU (Gated Recurrent Unit) layer, the "units" parameter specifies the number of neurons in the hidden layer. This follows the standard Keras naming convention, where "units" refers to the dimensionality of the layer's output space.

By specifying "units", you define the size of the GRU's output, which is also the number of neurons processing the input at that layer. Configuring this parameter correctly matters because it influences the model's capacity to learn complex patterns from sequential data.
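
A minimal sketch of this in code (the unit count, sequence length, and feature count below are arbitrary illustrative choices, not values from the question):

from tensorflow import keras
from tensorflow.keras import layers

# Input: sequences of 10 timesteps with 8 features each (arbitrary sizes,
# chosen only for illustration).
model = keras.Sequential([
    keras.Input(shape=(10, 8)),
    layers.GRU(units=64),   # "units" = 64 neurons in the GRU hidden layer
    layers.Dense(1),
])

model.summary()  # the GRU's output shape is (None, 64), matching "units"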

Using "neurons" would not be recognized in this context, because the parameter in the GRU constructor is explicitly named "units"; passing an unrecognized keyword argument raises an error. So while the other options may hint at related aspects of modeling with GRU layers, "units" is the parameter that defines the hidden layer's size in a GRU model.
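
For instance, attempting to construct the layer with a hypothetical "neurons" keyword (not a real Keras parameter) fails immediately; depending on the Keras version, the error surfaces as a TypeError or a ValueError:

from tensorflow.keras import layers

# "neurons" is not a parameter of GRU; Keras rejects unrecognized
# keyword arguments at construction time.
try:
    layers.GRU(neurons=64)
except (TypeError, ValueError) as err:
    print(err)  # e.g. keyword argument not understood: 'neurons'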
