Python keras.initializers() Examples

The following are 2 code examples of keras.initializers(). You can go to the original project or source file by following the links above each example. You may also want to check out all available functions and classes of the keras module, or try the search function.
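Before the examples, here is a minimal sketch of what keras.initializers provides: initializer objects (or their string aliases) that are passed to a layer's kernel_initializer and bias_initializer arguments. The layer below is purely illustrative and is not taken from either example.

import keras
from keras.layers import Dense

# 'glorot_uniform' is the string alias for keras.initializers.glorot_uniform();
# Constant(value=-2) starts every bias at -2, as the highway example below does
# for its transform gate.
layer = Dense(units=64,
              kernel_initializer='glorot_uniform',
              bias_initializer=keras.initializers.Constant(value=-2))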
Example #1
Source File: graph_yoon_kim.py    From Keras-TextClassification with MIT License
import keras
from keras import backend as K
from keras.layers import Dense


def highway_keras(x):
    # hand-written implementation
    # paper: Highway Networks (http://arxiv.org/abs/1505.00387)
    # formulas:
    # 1. s = sigmoid(Wx + b)
    # 2. z = s * relu(Wx + b) + (1 - s) * x
    # x shape : [N * time_depth, sum(filters)]

    # Table 1. CIFAR-10 test set accuracy of convolutional highway networks with
    # rectified linear activation and sigmoid gates.
    # For comparison, results reported by Romero et al. (2014)
    # using maxout networks are also shown.
    # Fitnets were trained using a two-step training procedure using soft targets from the trained Teacher network,
    # which was trained using backpropagation. We trained all highway networks directly using backpropagation.
    # * indicates networks which were trained only on a set of 40K out of 50K examples in the training set.



    # Figure 2. Visualization of certain internals of the blocks in the best 50 hidden layer highway networks trained on MNIST
    # (top row) and CIFAR-100 (bottom row). The first hidden layer is a plain layer which changes the dimensionality of the representation to 50. Each of
    # the 49 highway layers (y-axis) consists of 50 blocks (x-axis).
    # The first column shows the transform gate biases, which were initialized to -2 and -4 respectively.
    # In the second column the mean output of the transform gate over 10,000 training examples is depicted.
    # The third and fourth columns show the output of the transform gates and
    # the block outputs for a single random training sample.

    gate_transform = Dense(units=K.int_shape(x)[1],
                           activation='sigmoid',
                           use_bias=True,
                           kernel_initializer='glorot_uniform',
                           bias_initializer=keras.initializers.Constant(value=-2))(x)
    gate_cross = 1 - gate_transform
    block_state = Dense(units=K.int_shape(x)[1],
                        activation='relu',
                        use_bias=True,
                        kernel_initializer='glorot_uniform',
                        bias_initializer='zeros')(x)
    high_way = gate_transform * block_state + gate_cross * x

    return high_way 
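For reference, here is a hedged sketch of the same gating written with explicit Keras merge layers (Multiply, Add, Lambda) instead of raw tensor arithmetic; on some Keras/TensorFlow combinations the arithmetic in highway_keras loses the Keras layer history, and this form avoids that. The function name highway_keras_layers and the input size of 300 are illustrative assumptions, not part of the original project.

import keras
from keras import backend as K
from keras.layers import Dense, Input, Lambda, Multiply, Add
from keras.models import Model

def highway_keras_layers(x):
    units = K.int_shape(x)[1]
    # transform gate s = sigmoid(Wx + b), biased towards "carry" at initialization
    s = Dense(units, activation='sigmoid',
              bias_initializer=keras.initializers.Constant(value=-2))(x)
    # block state h = relu(Wx + b)
    h = Dense(units, activation='relu')(x)
    # carry gate (1 - s)
    one_minus_s = Lambda(lambda t: 1.0 - t)(s)
    # y = s * h + (1 - s) * x
    return Add()([Multiply()([s, h]), Multiply()([one_minus_s, x])])

inputs = Input(shape=(300,))            # assumed feature size
outputs = highway_keras_layers(inputs)
model = Model(inputs=inputs, outputs=outputs)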
Example #2
Source File: __init__.py    From deep_complex_networks with MIT License
import keras
from keras.layers import Input
from keras.models import Model
# ComplexConv1D and ComplexBN are the complex-valued layers shipped with the
# deep_complex_networks project (import path from its complexnn package assumed).
from complexnn import ComplexConv1D, ComplexBN


def get_deep_convnet(window_size=4096, channels=2, output_size=84):
    inputs = Input(shape=(window_size, channels))
    outs = inputs

    outs = (ComplexConv1D(
        16, 6, strides=2, padding='same',
        activation='linear',
        kernel_initializer='complex_independent'))(outs)
    outs = (ComplexBN(axis=-1))(outs)
    outs = (keras.layers.Activation('relu'))(outs)
    outs = (keras.layers.AveragePooling1D(pool_size=2, strides=2))(outs)

    outs = (ComplexConv1D(
        32, 3, strides=2, padding='same',
        activation='linear',
        kernel_initializer='complex_independent'))(outs)
    outs = (ComplexBN(axis=-1))(outs)
    outs = (keras.layers.Activation('relu'))(outs)
    outs = (keras.layers.AveragePooling1D(pool_size=2, strides=2))(outs)
    
    outs = (ComplexConv1D(
        64, 3, strides=1, padding='same',
        activation='linear',
        kernel_initializer='complex_independent'))(outs)
    outs = (ComplexBN(axis=-1))(outs)
    outs = (keras.layers.Activation('relu'))(outs)
    outs = (keras.layers.AveragePooling1D(pool_size=2, strides=2))(outs)

    outs = (ComplexConv1D(
        64, 3, strides=1, padding='same',
        activation='linear',
        kernel_initializer='complex_independent'))(outs)
    outs = (ComplexBN(axis=-1))(outs)
    outs = (keras.layers.Activation('relu'))(outs)
    outs = (keras.layers.AveragePooling1D(pool_size=2, strides=2))(outs)

    outs = (ComplexConv1D(
        128, 3, strides=1, padding='same',
        activation='relu',
        kernel_initializer='complex_independent'))(outs)
    outs = (ComplexConv1D(
        128, 3, strides=1, padding='same',
        activation='linear',
        kernel_initializer='complex_independent'))(outs)
    outs = (ComplexBN(axis=-1))(outs)
    outs = (keras.layers.Activation('relu'))(outs)
    outs = (keras.layers.AveragePooling1D(pool_size=2, strides=2))(outs)

    #outs = (keras.layers.MaxPooling1D(pool_size=2))
    #outs = (Permute([2, 1]))
    outs = (keras.layers.Flatten())(outs)
    outs = (keras.layers.Dense(2048, activation='relu',
                           kernel_initializer='glorot_normal'))(outs)
    predictions = (keras.layers.Dense(output_size, activation='sigmoid',
                                 bias_initializer=keras.initializers.Constant(value=-5)))(outs)

    model = Model(inputs=inputs, outputs=predictions)
    model.compile(optimizer=keras.optimizers.Adam(lr=1e-4),
                  loss='binary_crossentropy',
                  metrics=['accuracy'])
    return model
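A minimal usage sketch, assuming the complex-valued layers are importable in the current environment; the batch size and the random data below are illustrative only and not part of the original project.

import numpy as np

model = get_deep_convnet(window_size=4096, channels=2, output_size=84)
model.summary()

# Random tensors with the shapes the model expects:
# (batch, window_size, channels) inputs and (batch, output_size) multi-label targets.
x = np.random.randn(8, 4096, 2).astype('float32')
y = np.random.randint(0, 2, size=(8, 84)).astype('float32')
model.fit(x, y, batch_size=4, epochs=1)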