ejemplo_pipeline_mlp module
- ejemplo_pipeline_mlp.optim_params = {'batch_size': (10, 64), 'epochs': 5, 'hidden_size': 16, 'lr': (0.0001, 0.1), 'n_layers': (1, 4), 'print_rate': 10, 'print_rate_epoch': 0}
If no optimization is needed, the training parameters can be defined as fixed values instead of search ranges.
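A minimal sketch of both configurations, assuming that tuple values in `optim_params` denote (lower, upper) search ranges for hyperparameter optimization while scalar values stay fixed; the keys mirror those shown above, and `train_params` is a hypothetical fixed-value counterpart for the no-optimization case, with illustrative values chosen here for the example:

```python
# Hyperparameter search configuration: tuples are assumed to be
# (lower, upper) search ranges, scalars are fixed settings.
optim_params = {
    'batch_size': (10, 64),    # assumed range to search over
    'epochs': 5,               # fixed number of training epochs
    'hidden_size': 16,         # fixed units per hidden layer
    'lr': (0.0001, 0.1),       # assumed learning-rate search range
    'n_layers': (1, 4),        # assumed range for number of hidden layers
    'print_rate': 10,          # fixed logging frequency (presumably per batch)
    'print_rate_epoch': 0,     # fixed per-epoch logging (0 presumably disables it)
}

# Hypothetical fixed configuration for the no-optimization case:
# every entry is a concrete value rather than a range.
train_params = {
    'batch_size': 32,
    'epochs': 5,
    'hidden_size': 16,
    'lr': 0.001,
    'n_layers': 2,
    'print_rate': 10,
    'print_rate_epoch': 0,
}
```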