Here I’m answering the question in the title rather than OP’s exact problem. I’m doing this because this question shows up near the top when you google the topic.
You can implement a custom metric in two ways.
- As mentioned in the Keras documentation, you can pass a plain function as a metric:

  ```python
  import keras.backend as K

  def mean_pred(y_true, y_pred):
      return K.mean(y_pred)

  model.compile(optimizer='sgd',
                loss='binary_crossentropy',
                metrics=['accuracy', mean_pred])
  ```

  But here you have to remember, as mentioned in Marcin Możejko’s answer, that `y_true` and `y_pred` are tensors. So in order to calculate the metric correctly you need to use `keras.backend` functionality. Please look at this SO question for details: How to calculate F1 Macro in Keras?

- Or you can implement it in a hacky way, as mentioned in a Keras GH issue. For that you need to use the `callbacks` argument of `model.fit` and compute the metric on the full validation set at the end of each epoch:

  ```python
  import keras
  import numpy as np
  from keras.optimizers import SGD
  from sklearn.metrics import roc_auc_score

  model = keras.models.Sequential()
  # ... add your layers here
  sgd = SGD(lr=0.001, momentum=0.9)
  model.compile(optimizer=sgd,
                loss='categorical_crossentropy',
                metrics=['accuracy'])

  class Metrics(keras.callbacks.Callback):
      def on_train_begin(self, logs={}):
          self._data = []

      def on_epoch_end(self, batch, logs={}):
          X_val, y_val = self.validation_data[0], self.validation_data[1]
          # use self.model so the callback doesn't depend on a global variable
          y_predict = np.asarray(self.model.predict(X_val))

          # note: roc_auc_score is normally given scores/probabilities,
          # not argmax'd class labels
          y_val = np.argmax(y_val, axis=1)
          y_predict = np.argmax(y_predict, axis=1)

          self._data.append({
              'val_rocauc': roc_auc_score(y_val, y_predict),
          })

      def get_data(self):
          return self._data

  metrics = Metrics()
  history = model.fit(X_train, y_train,
                      epochs=100,
                      validation_data=(X_val, y_val),
                      callbacks=[metrics])
  metrics.get_data()
  ```
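One caveat about the callback approach: `roc_auc_score` is a rank-based metric that expects continuous scores (e.g. predicted probabilities), so taking `argmax` of the predictions first throws away information and can change the result. Here is a minimal numpy sketch (my own illustration, not part of the original code) of what ROC AUC computes and how hard labels distort it:

```python
import numpy as np

def roc_auc(y_true, y_score):
    """ROC AUC as the probability that a randomly chosen positive
    example is scored above a randomly chosen negative one
    (ties count as half)."""
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    greater = np.sum(pos[:, None] > neg[None, :])
    ties = np.sum(pos[:, None] == neg[None, :])
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

y_true = np.array([0, 0, 1, 1])
y_score = np.array([0.2, 0.6, 0.7, 0.9])   # continuous scores
y_label = (y_score > 0.5).astype(int)      # hard labels: [0, 1, 1, 1]

print(roc_auc(y_true, y_score))  # 1.0  -- scores rank all positives above negatives
print(roc_auc(y_true, y_label))  # 0.75 -- argmax'd labels lose that ranking
```

For binary classification you would therefore typically pass something like the positive-class probability column to `roc_auc_score` instead of argmax'd predictions.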
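As for why the first (metrics-argument) approach needs care: Keras evaluates such a metric per batch and averages the results, and for metrics like F1 the mean of per-batch values generally differs from the metric computed over the whole dataset. A small numpy demonstration with made-up data (the `f1` helper here is my own sketch):

```python
import numpy as np

def f1(y_true, y_pred):
    # Binary F1 from true/false positive and false negative counts
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

y_true = np.array([1, 0, 1, 0])
y_pred = np.array([0, 0, 1, 0])

# F1 over the full dataset
global_f1 = f1(y_true, y_pred)                                  # 2/3
# Mean of per-batch F1 over two batches of size 2
batch_f1 = np.mean([f1(y_true[:2], y_pred[:2]),
                    f1(y_true[2:], y_pred[2:])])                # 0.5
print(round(global_f1, 3), round(batch_f1, 3))  # 0.667 0.5
```

This discrepancy is exactly why the callback approach above computes the metric once over the full validation set instead.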