
Commit 0eab09e

Update neural_network_future.py
Made AdamW the default optimizer. We still need to parameterize the optimizer choice and expose weight_decay as an optional hyperparameter.
1 parent e6ae27c commit 0eab09e

File tree

1 file changed

+4
-2
lines changed


cerebros/neuralnetworkfuture/neural_network_future.py

@@ -332,8 +332,10 @@ def compile_neural_network(self):
         self.materialized_neural_network.compile(
             loss=self.loss,
             metrics=self.metrics,
-            optimizer=tf.keras.optimizers.Adam(
-                learning_rate=self.learning_rate),
+            optimizer=tf.keras.optimizers.AdamW(
+                learning_rate=self.learning_rate,
+                weight_decay=0.004  # Add weight decay parameter
+            ),
             jit_compile=jit_compile)
 
     def util_parse_connectivity_csv(self):
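The commit message notes that the optimizer and its weight_decay should eventually be parameterized rather than hard-coded. A minimal sketch of that follow-up, assuming a helper that builds a Keras-style optimizer config dict (the function name and defaults here are hypothetical, not the Cerebros API):

```python
# Hypothetical sketch of the parameterization the commit message calls for:
# accept the optimizer name and weight_decay as arguments instead of
# hard-coding AdamW(weight_decay=0.004) in compile_neural_network().
from typing import Optional


def make_optimizer_config(optimizer: str = "AdamW",
                          learning_rate: float = 1e-3,
                          weight_decay: Optional[float] = 0.004) -> dict:
    """Return a config dict compatible with tf.keras.optimizers.get()."""
    kwargs = {"learning_rate": learning_rate}
    # weight_decay only applies to decoupled-decay optimizers such as AdamW;
    # pass weight_decay=None to omit it entirely.
    if optimizer == "AdamW" and weight_decay is not None:
        kwargs["weight_decay"] = weight_decay
    return {"class_name": optimizer, "config": kwargs}


# Build a config; tf.keras.optimizers.get(cfg) would materialize the
# optimizer object for model.compile(optimizer=...).
cfg = make_optimizer_config(learning_rate=0.005)
```

Routing the choice through a config dict keeps the optimizer and weight_decay tunable as hyperparameters without touching the compile call itself.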
