TimeDistributed(Dense) vs Dense in Keras – Same number of parameters

TimeDistributedDense applies the same Dense layer to every time step during GRU/LSTM cell unrolling, so the error function is computed between the predicted label sequence and the actual label sequence (which is normally the requirement for sequence-to-sequence labeling problems). However, with return_sequences=False, the Dense layer is applied only once, at the last cell. This is normally … Read more
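
A minimal sketch of the point in the title (the shapes of 10 time steps x 16 features and the layer sizes are illustrative, assuming the tf.keras API): both variants create a Dense kernel of the same size, because TimeDistributed reuses one kernel at every step.

    from tensorflow.keras import Sequential, Input
    from tensorflow.keras.layers import LSTM, Dense, TimeDistributed

    # One prediction per time step: the same Dense is applied at all 10 steps
    seq_model = Sequential([
        Input(shape=(10, 16)),
        LSTM(32, return_sequences=True),
        TimeDistributed(Dense(1)),
    ])

    # One prediction for the whole sequence: Dense sees only the last step
    last_model = Sequential([
        Input(shape=(10, 16)),
        LSTM(32, return_sequences=False),
        Dense(1),
    ])

    seq_model.summary()   # Dense: 32*1 + 1 = 33 parameters
    last_model.summary()  # Dense: the same 33 parameters, applied once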

How do I get the weights of a layer in Keras?

If you want to get the weights and biases of all layers, you can simply use:

    for layer in model.layers:
        print(layer.get_config(), layer.get_weights())

This will print all the relevant information. If you want the weights directly returned as numpy arrays, you can use:

    first_layer_weights = model.layers[0].get_weights()[0]
    first_layer_biases = model.layers[0].get_weights()[1]
    second_layer_weights = model.layers[1].get_weights()[0]
    second_layer_biases = model.layers[1].get_weights()[1]

etc.
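
As a quick sanity check, here is a hypothetical two-layer model showing the shapes get_weights() returns (kernel first, then bias):

    from tensorflow.keras import Sequential, Input
    from tensorflow.keras.layers import Dense

    model = Sequential([Input(shape=(4,)), Dense(8), Dense(2)])

    w0, b0 = model.layers[0].get_weights()
    print(w0.shape, b0.shape)  # (4, 8) (8,)
    w1, b1 = model.layers[1].get_weights()
    print(w1.shape, b1.shape)  # (8, 2) (2,)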

Reset weights in Keras layer

Save the initial weights right after compiling the model but before training it:

    model.save_weights('model.h5')

and then after training, "reset" the model by reloading the initial weights:

    model.load_weights('model.h5')

This gives you an apples-to-apples model for comparing different data sets, and it should be quicker than recompiling the entire model.
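
A sketch of the pattern in a loop (the datasets variable and the file name are hypothetical), so every data set is trained from the same starting point:

    model.save_weights('initial.h5')              # snapshot before any training

    for x_train, y_train in datasets:
        model.load_weights('initial.h5')          # restore the initial weights
        model.fit(x_train, y_train, epochs=5)     # train afresh on this data set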

Negative dimension size caused by subtracting 3 from 1 for 'conv2d_2/convolution'

By default, Convolution2D (https://keras.io/layers/convolutional/) expects the input to be in the format (samples, rows, cols, channels), which is "channels last". Your data seems to be in the format (samples, channels, rows, cols). You should be able to fix this using the optional keyword data_format="channels_first" when declaring the Convolution2D layer:

    model.add(Convolution2D(32, (3, 3), activation='relu', input_shape=(1, 28, 28), data_format="channels_first"))
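
Alternatively, you can keep the default channels-last layout and move the channel axis of the data instead. A sketch, assuming x_train is a NumPy array shaped (samples, 1, 28, 28):

    import numpy as np

    x_train = np.transpose(x_train, (0, 2, 3, 1))  # now (samples, 28, 28, 1)
    model.add(Convolution2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))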

How do you create a custom activation function with Keras?

Credits to this Github issue comment by Ritchie Ng.

    # Creating a model
    from keras.models import Sequential
    from keras.layers import Dense

    # Custom activation function
    from keras.layers import Activation
    from keras import backend as K
    from keras.utils.generic_utils import get_custom_objects

    def custom_activation(x):
        return (K.sigmoid(x) * 5) - 1

    get_custom_objects().update({'custom_activation': Activation(custom_activation)})

    # Usage
    model = Sequential()
    model.add(Dense(32, … Read more
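
Once registered via get_custom_objects(), the activation can be referenced by its string name like a built-in one. A minimal sketch of the usage (the layer sizes and input dimension are illustrative):

    model = Sequential()
    model.add(Dense(32, input_dim=10))
    model.add(Activation('custom_activation'))
    model.compile(loss='mse', optimizer='adam')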

When does keras reset an LSTM state?

Checking with some tests, I came to the following conclusion, which is in line with the documentation and with Nassim's answer: First, there isn't a single state in a layer, but one state per sample in the batch. There are batch_size parallel states in such a layer. stateful=False: In the stateful=False case, all the states are … Read more
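
A sketch of the contrast (shapes are illustrative, assuming the tf.keras 2 API): with stateful=False the per-sample states are reset automatically after every batch, while with stateful=True they carry over between batches until you reset them yourself.

    from tensorflow.keras import Sequential, Input
    from tensorflow.keras.layers import LSTM

    model = Sequential([
        Input(shape=(10, 16), batch_size=8),  # stateful layers need a fixed batch size
        LSTM(32, stateful=True),
    ])
    # ... feed consecutive chunks of the same 8 sequences ...
    model.layers[0].reset_states()            # explicitly clear all 8 per-sample states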

Keras input explanation: input_shape, units, batch_size, dim, etc

Units: The number of "neurons", or "cells", or whatever the layer has inside it. It's a property of each layer, and yes, it's related to the output shape (as we will see later). In your picture, except for the input layer, which is conceptually different from the other layers, you have: Hidden layer 1: 4 units … Read more
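
A minimal sketch tying the terms together (sizes are illustrative): units fixes each layer's output width, input_shape describes a single sample without the batch dimension, and the batch size stays free (None) until you fit or predict:

    from tensorflow.keras import Sequential, Input
    from tensorflow.keras.layers import Dense

    model = Sequential([
        Input(shape=(3,)),  # each sample has 3 features; batch size is left as None
        Dense(4),           # units=4 -> output shape (None, 4)
        Dense(2),           # units=2 -> output shape (None, 2)
    ])
    print(model.output_shape)  # (None, 2)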
