Model
Model(
run_outputs, run_inputs = None, train_outputs = None, train_inputs = None,
train_loss = None, eval_inputs = None, eval_outputs = None, eval_score = None,
name = 'Model'
)
Base Model
Args
- train_loss : either a callable, layer, or dictionary mapping a callable or layer to the target outputs.
- train_inputs : defaults to run inputs; if a loss is provided, you can either supply the inputs to the train graph that include the loss, or let the Model create inputs for you.
Methods:
.draw
.draw(
path = 'graph.pdf'
)
.set_optimizer
.set_optimizer(
optimizer, **config
)
Set the optimizer for this model.
Optimizer Hyper-Parameters
The arguments passed to the optimizer constructor can be regular Python values,
tensors, or callables. If they are callable, they will be called during apply_gradients()
to get the value for the hyper-parameter.
Args
- optimizer (Optimizer) : optimizer class or instance
- config : dictionary with parameters for the optimizer; if you want to modify these parameters during
training, pass a
tx.Param
as the value for the given parameter instead of a constant value.
Returns
- optimizer (Optimizer) : the configured optimizer instance.
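The callable hyper-parameter mechanism above can be sketched in plain Python. This is an illustrative stand-in, not the tx API: `SimpleOptimizer` and `lr_schedule` are hypothetical names, and the real optimizer resolves callables inside apply_gradients() as described.

```python
class SimpleOptimizer:
    """Stand-in optimizer that accepts a value or a zero-argument callable."""

    def __init__(self, learning_rate):
        # learning_rate may be a plain number or a callable
        self.learning_rate = learning_rate

    def apply_gradients(self):
        # resolve callables at application time, as the docs describe
        if callable(self.learning_rate):
            return self.learning_rate()
        return self.learning_rate


step = {"n": 0}


def lr_schedule():
    # decay the learning rate as training progresses
    step["n"] += 1
    return 0.1 / step["n"]


opt = SimpleOptimizer(lr_schedule)
print(opt.apply_gradients())  # 0.1 on the first call
print(opt.apply_gradients())  # 0.05 on the second call
```

Because the callable is re-evaluated on every apply_gradients() call, schedules like decay or warm-up can change the hyper-parameter without rebuilding the optimizer.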
.run
.run(
input_feed, compiled_graph = False
)
.train_step
.train_step(
input_feed
)
.eval_step
.eval_step(
input_feed
)
Args
- input_feed : dictionary mapping input layers to the data to be fed to the model.
Returns
- (*eval_outputs, eval_score) : tuple with the evaluation outputs followed by the evaluation score.
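The input_feed passed to run, train_step, and eval_step is a dictionary from input layers to data. A minimal sketch of that shape, assuming a hypothetical `Input` stand-in rather than the real tx.Input layer:

```python
class Input:
    """Stand-in for an input layer; only used to key the feed dictionary."""

    def __init__(self, name):
        self.name = name


x = Input("x")
y = Input("y")

# one batch of data per input layer: {Input: data}
input_feed = {x: [[1.0, 2.0]], y: [[0.0]]}

# a step method would look up the data destined for each graph input
batch = {layer.name: data for layer, data in input_feed.items()}
print(batch)  # {'x': [[1.0, 2.0]], 'y': [[0.0]]}
```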
.train
.train(
train_data, validation_data = None, test_data = None, epochs = 1,
steps_per_epoch = None, callbacks = []
)
Main training loop.
Args
- train_data : an iterable of dictionaries mapping Input layers to values {Input:data} (calling iter on this object should yield an iterator for an epoch).
- validation_data : an iterable of dictionaries mapping Input layers to values {Input:data}.
- test_data : an iterable of dictionaries mapping Input layers to values {Input:data}.
- epochs (int) : number of training epochs.
- steps_per_epoch : number of steps in an epoch; if not None, epochs are incremented each time this number of steps pass, even if the entire train_data has not been traversed.
- callbacks :
Callback
functions scheduled during the training.
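The interaction between epochs and steps_per_epoch can be sketched with a small counting helper. This is not the tx training loop itself, just an illustration of the semantics stated above: when steps_per_epoch is set, an epoch ends after that many steps even if train_data has not been fully traversed.

```python
from itertools import cycle, islice


def count_steps(train_data, epochs=1, steps_per_epoch=None):
    """Return the total number of train steps the loop would execute."""
    if steps_per_epoch is None:
        # one epoch is one full pass over train_data
        return epochs * len(train_data)
    # cycle over the data so an epoch boundary can fall mid-dataset
    total = epochs * steps_per_epoch
    return len(list(islice(cycle(train_data), total)))


data = [{"x": 1}, {"x": 2}, {"x": 3}]  # 3 batches per full pass
print(count_steps(data, epochs=2))                      # 6: two full passes
print(count_steps(data, epochs=2, steps_per_epoch=2))   # 4: epochs advance every 2 steps
```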