RNN

RNN(
   input_seq, previous_state = None, cell_config: Callable[[Union[Layer,
   tf.Tensor]], BaseRNNCell] = None, n_units = None, reverse = False,
   regularized = False, stateful = False, return_state = False, name = 'rnn_layer',
   share_state_with: Optional['RNN'] = None
)

Recurrent Layer

Takes a batch of sequences in time-major order [time_step, batch_size, feature_size] and dynamically unrolls a recurrent cell, applying it to each time step. The sequence must have at least one time step (time_step >= 1): the recurrent cell is created on the first step, after which the layer supports an unknown number of time steps.
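The unrolling described above can be sketched in plain NumPy. This is a conceptual sketch only, not the tensorx implementation; `simple_cell`, `unroll`, and the weight names are hypothetical stand-ins for the cell layer:

```python
import numpy as np

def simple_cell(x, state, Wx, Wh):
    """A minimal tanh RNN cell: state' = tanh(x @ Wx + state @ Wh)."""
    return np.tanh(x @ Wx + state @ Wh)

def unroll(input_seq, Wx, Wh, reverse=False):
    """Unroll the cell over a time-major sequence [time_step, batch, features]."""
    time_steps, batch_size, _ = input_seq.shape
    n_units = Wh.shape[0]
    state = np.zeros((batch_size, n_units))
    steps = reversed(range(time_steps)) if reverse else range(time_steps)
    outputs = []
    for t in steps:
        state = simple_cell(input_seq[t], state, Wx, Wh)
        outputs.append(state)
    # time-major outputs plus the final state (cf. return_state=True)
    return np.stack(outputs), state

rng = np.random.default_rng(0)
seq = rng.normal(size=(5, 2, 3))    # 5 time steps, batch of 2, 3 features
Wx = rng.normal(size=(3, 4)) * 0.1  # input -> 4 hidden units
Wh = rng.normal(size=(4, 4)) * 0.1  # hidden -> hidden
outputs, final_state = unroll(seq, Wx, Wh)
print(outputs.shape)  # (5, 2, 4)
```

The number of time steps is only read from the input at call time, which is why the unroll can handle sequences of unknown length once the cell exists.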

Args

  • input_seq : a Layer whose tensor has the shape [time_step,batch_size,feature_size] with time_step>=1

Attributes

  • cell : a Layer of type RecurrentCell used in the unrolled steps
  • cell_config (Callable[[Union[Layer, tf.Tensor]], BaseRNNCell]) : a function that returns a recurrent cell Layer when applied to an input layer or tensor. This can be achieved with a lambda closing over the cell parameters, or with a partial
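To illustrate the cell_config contract, here is a small sketch. The `DummyCell` class is a hypothetical stand-in for a BaseRNNCell subclass; the point is that a cell_config is any callable mapping an input to a fully configured cell:

```python
from functools import partial

class DummyCell:
    """Hypothetical stand-in for a BaseRNNCell subclass."""
    def __init__(self, input_layer, n_units, activation=None):
        self.input_layer = input_layer
        self.n_units = n_units
        self.activation = activation

# Bind the remaining constructor arguments up front, either way:
cell_config_lambda = lambda x: DummyCell(x, n_units=8)
cell_config_partial = partial(DummyCell, n_units=8, activation="tanh")

# The RNN layer later applies the config to its (possibly dummy) input:
cell = cell_config_partial("input_placeholder")
print(cell.n_units)  # 8
```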

Methods:

.compute_shape

.compute_shape()

.init_state

.init_state()

Creates a recurrent cell from the given cell_config.

Dev note

The only stateful component here is the cell, which is a layer. Since layers need to know their input layer before their state can be initialized, we give the cell a dummy input.
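The dummy-input pattern can be illustrated with a minimal sketch; the `Cell` class and `init_state` function below are hypothetical stand-ins, not the tensorx implementation:

```python
import numpy as np

class Cell:
    """A layer-like cell whose state shape depends on its input."""
    def __init__(self, input_tensor, n_units):
        batch_size = input_tensor.shape[0]
        # the state can only be allocated once the input is known
        self.state = np.zeros((batch_size, n_units))

def init_state(cell_config, batch_size, feature_size):
    # the cell needs *some* input before its state exists, so feed it zeros
    dummy_input = np.zeros((batch_size, feature_size))
    return cell_config(dummy_input)

cell = init_state(lambda x: Cell(x, n_units=4), batch_size=2, feature_size=3)
print(cell.state.shape)  # (2, 4)
```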

Returns

  • state (LayerState) : a state with a cell layer that performs the computations

.compute

.compute(
   input_seq, *prev_state
)

.reuse_with

.reuse_with(
   input_seq, *previous_state, regularized = None, reverse = None, stateful = None,
   return_state = None, name = None
)

.reset

.reset()