SimpleRNN class

keras.layers.SimpleRNN(
    units,
    activation="tanh",
    use_bias=True,
    kernel_initializer="glorot_uniform",
    recurrent_initializer="orthogonal",
    bias_initializer="zeros",
    kernel_regularizer=None,
    recurrent_regularizer=None,
    bias_regularizer=None,
    activity_regularizer=None,
    kernel_constraint=None,
    recurrent_constraint=None,
    bias_constraint=None,
    dropout=0.0,
    recurrent_dropout=0.0,
    return_sequences=False,
    return_state=False,
    go_backwards=False,
    stateful=False,
    unroll=False,
    seed=None,
    **kwargs
)
Fully-connected RNN where the output is to be fed back as the new input.
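To make "fed back as the new input" concrete, here is a minimal NumPy sketch of the recurrence, assuming the standard formulation h_t = tanh(x_t @ W + h_{t-1} @ U + b); the variable names and shapes are illustrative, not part of the API:

import numpy as np

def simple_rnn_step(x_t, h_prev, W, U, b):
    # One step of the recurrence: the previous output h_prev is
    # fed back alongside the current input x_t.
    return np.tanh(x_t @ W + h_prev @ U + b)

# Illustrative shapes: feature dim 8, units 4.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 4))   # kernel
U = rng.standard_normal((4, 4))   # recurrent_kernel
b = np.zeros(4)                   # bias
h = np.zeros(4)                   # initial state
for t in range(10):               # iterate over timesteps
    h = simple_rnn_step(rng.standard_normal(8), h, W, U, b)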
Arguments

- units: Positive integer, dimensionality of the output space.
- activation: Activation function to use.
  Default: hyperbolic tangent (tanh).
  If you pass None, no activation is applied
  (ie. "linear" activation: a(x) = x).
- use_bias: Boolean (default True), whether the layer uses
  a bias vector.
- kernel_initializer: Initializer for the kernel weights matrix,
  used for the linear transformation of the inputs. Default:
  "glorot_uniform".
- recurrent_initializer: Initializer for the recurrent_kernel
  weights matrix, used for the linear transformation of the recurrent
  state. Default: "orthogonal".
- bias_initializer: Initializer for the bias vector. Default: "zeros".
- kernel_regularizer: Regularizer function applied to the kernel weights
  matrix. Default: None.
- recurrent_regularizer: Regularizer function applied to the
  recurrent_kernel weights matrix. Default: None.
- bias_regularizer: Regularizer function applied to the bias vector.
  Default: None.
- activity_regularizer: Regularizer function applied to the output of the
  layer (its "activation"). Default: None.
- kernel_constraint: Constraint function applied to the kernel weights
  matrix. Default: None.
- recurrent_constraint: Constraint function applied to the
  recurrent_kernel weights matrix. Default: None.
- bias_constraint: Constraint function applied to the bias vector.
  Default: None.
- dropout: Float between 0 and 1. Fraction of the units to drop for the
  linear transformation of the inputs. Default: 0.
- recurrent_dropout: Float between 0 and 1. Fraction of the units to drop
  for the linear transformation of the recurrent state. Default: 0.
- return_sequences: Boolean. Whether to return the last output in the
  output sequence, or the full sequence. Default: False.
- return_state: Boolean. Whether to return the last state in addition to
  the output. Default: False.
- go_backwards: Boolean (default: False).
  If True, process the input sequence backwards and return the
  reversed sequence.
- stateful: Boolean (default: False). If True, the last state
  for each sample at index i in a batch will be used as the
  initial state for the sample of index i in the following batch.
- unroll: Boolean (default: False).
  If True, the network will be unrolled,
  else a symbolic loop will be used.
  Unrolling can speed-up an RNN,
  although it tends to be more memory-intensive.
  Unrolling is only suitable for short sequences.
- seed: Random seed for dropout.

Call arguments

- sequence: A 3D tensor, with shape [batch, timesteps, feature].
- mask: Binary tensor of shape [batch, timesteps] indicating whether
  a given timestep should be masked. An individual True entry
  indicates that the corresponding timestep should be utilized,
  while a False entry indicates that the corresponding timestep
  should be ignored. (A sketch of building such a mask follows this list.)
- training: Python boolean indicating whether the layer should behave in
  training mode or in inference mode. This argument is passed to the cell
  when calling it. This is only relevant if dropout or
  recurrent_dropout is used.
- initial_state: List of initial state tensors to be passed to the first
  call of the cell.
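For padded batches, the mask call argument can be built by hand. A minimal sketch, assuming two samples padded to five timesteps (the mask values are illustrative):

import numpy as np
import keras

inputs = np.random.random((2, 5, 8))
# False marks padded timesteps that the layer should ignore.
mask = np.array([
    [True, True, True, False, False],
    [True, True, True, True, False],
])
layer = keras.layers.SimpleRNN(4)
output = layer(inputs, mask=mask)  # The output has shape `(2, 4)`.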
Example

import numpy as np
import keras

inputs = np.random.random((32, 10, 8))
simple_rnn = keras.layers.SimpleRNN(4)
output = simple_rnn(inputs)  # The output has shape `(32, 4)`.
simple_rnn = keras.layers.SimpleRNN(
    4, return_sequences=True, return_state=True
)
# whole_sequence_output has shape `(32, 10, 4)`.
# final_state has shape `(32, 4)`.
whole_sequence_output, final_state = simple_rnn(inputs)
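Because return_state exposes the final hidden state, it can be passed back in through the initial_state call argument to continue the recurrence across separate calls. A minimal sketch continuing the example above, assuming the sequence is split into two consecutive chunks:

# Continue the recurrence across two chunks of the same sequences.
first_half, second_half = inputs[:, :5, :], inputs[:, 5:, :]
_, state = simple_rnn(first_half)
# Passing the previous final state as initial_state resumes where
# the first call left off.
whole_sequence_output, final_state = simple_rnn(
    second_half, initial_state=[state]
)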