Java Code Examples for org.deeplearning4j.nn.weights.WeightInit#DISTRIBUTION

The following examples show how to use org.deeplearning4j.nn.weights.WeightInit#DISTRIBUTION. The source file, project, and license for each example are noted above the code.
Example 1
Source File: ImageUtils.java    From AILibs with GNU Affero General Public License v3.0
public static PretrainedNNFilter getPretrainedNNFilterByName(final String name, final int layer,
                                                             final long[] shape) {
    // The zoo model constructors require int[] shapes, while DL4J uses long[] elsewhere,
    // so convert the given shape here
    final int[] intShape = Arrays.stream(shape).mapToInt(i -> (int) i).toArray();

    switch (name) {
        case "AlexNet":
            return new PretrainedNNFilter(new AlexNet(42, intShape, 10, new Nesterovs(1e-2, 0.9), CacheMode.NONE,
                    WorkspaceMode.ENABLED, AlgoMode.PREFER_FASTEST), layer, shape, name);
        case "LeNet":
            return new PretrainedNNFilter(new LeNet(42, intShape, 10, new Nesterovs(1e-2, 0.9), CacheMode.NONE,
                    WorkspaceMode.ENABLED, AlgoMode.PREFER_FASTEST), layer, shape, name);
        case "VGG19":
            return new PretrainedNNFilter(new VGG19(42, intShape, 10, new Nesterovs(1e-2, 0.9), CacheMode.NONE,
                    WorkspaceMode.ENABLED, AlgoMode.PREFER_FASTEST), layer, shape, name);
        case "ResNet50":
            return new PretrainedNNFilter(new ResNet50(42, intShape, 10, WeightInit.DISTRIBUTION,
                    new Nesterovs(1e-2, 0.9), CacheMode.NONE, WorkspaceMode.ENABLED, AlgoMode.PREFER_FASTEST), layer,
                    shape, name);
        default:
            return new PretrainedNNFilter(new VGG16(42, intShape, 10, new Nesterovs(1e-2, 0.9), CacheMode.NONE,
                    WorkspaceMode.ENABLED, AlgoMode.PREFER_FASTEST), layer, shape, "VGG16");
    }
}
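
A hedged usage sketch for the helper above (not part of the original file): the shape values and layer index are illustrative assumptions, and it assumes the static method is called through the ImageUtils class named in the source-file header.

// Illustrative only: {3, 224, 224} (channels, height, width) and layer index 5
// are assumptions, not values taken from the original project.
long[] shape = {3, 224, 224};
PretrainedNNFilter resNetFilter = ImageUtils.getPretrainedNNFilterByName("ResNet50", 5, shape);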
 
Example 2
Source File: TransferLearning.java    From deeplearning4j with Apache License 2.0
public GraphBuilder nOutReplace(String layerName, int nOut, WeightInit scheme, Distribution dist) {
    if(scheme == WeightInit.DISTRIBUTION) {
        throw new UnsupportedOperationException("Not supported!, Use " +
                "nOutReplace(layerNum, nOut, new WeightInitDistribution(dist), new WeightInitDistribution(distNext)) instead!");
    }
    return nOutReplace(layerName, nOut, scheme.getWeightInitFunction(), new WeightInitDistribution(dist));
}
 
Example 3
Source File: TransferLearning.java    From deeplearning4j with Apache License 2.0
public GraphBuilder nOutReplace(String layerName, int nOut, Distribution dist, WeightInit scheme) {
    if(scheme == WeightInit.DISTRIBUTION) {
        throw new UnsupportedOperationException("Not supported!, Use " +
                "nOutReplace(layerNum, nOut, new WeightInitDistribution(dist), new WeightInitDistribution(distNext)) instead!");
    }
    return nOutReplace(layerName, nOut, new WeightInitDistribution(dist), scheme.getWeightInitFunction());
}
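
A hedged usage sketch of this GraphBuilder overload (not from the original file): the layer name, sizes, and distribution parameters are illustrative assumptions.

// Illustrative only: layer name "fc1", nOut 256 and NormalDistribution(0, 0.1) are
// assumptions. Passing WeightInit.DISTRIBUTION as the scheme would throw the
// UnsupportedOperationException shown above; use a Distribution argument instead.
ComputationGraph newGraph = new TransferLearning.GraphBuilder(pretrainedGraph)
        .nOutReplace("fc1", 256, new NormalDistribution(0, 0.1), WeightInit.XAVIER)
        .build();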
 
Example 4
Source File: TransferLearning.java    From deeplearning4j with Apache License 2.0
public GraphBuilder nOutReplace(String layerName, int nOut, WeightInit scheme, WeightInit schemeNext) {
    if(scheme == WeightInit.DISTRIBUTION || schemeNext == WeightInit.DISTRIBUTION) {
        throw new UnsupportedOperationException("Not supported!, Use " +
                "nOutReplace(layerNum, nOut, new WeightInitDistribution(dist), new WeightInitDistribution(distNext)) instead!");
    }
    return nOutReplace(layerName, nOut, scheme.getWeightInitFunction(), schemeNext.getWeightInitFunction());
}
 
Example 5
Source File: BaseRecurrentLayer.java    From deeplearning4j with Apache License 2.0
/**
 * Set the weight initialization for the recurrent weights. Note that if this is not set explicitly, the same
 * weight initialization as the layer input weights is also used for the recurrent weights.
 *
 * @param weightInit Weight initialization for the recurrent weights only.
 */
public T weightInitRecurrent(WeightInit weightInit) {
    if (weightInit == WeightInit.DISTRIBUTION) {
        throw new UnsupportedOperationException(
                        "Not supported!, Use weightInit(Distribution distribution) instead!");
    }

    this.setWeightInitFnRecurrent(weightInit.getWeightInitFunction());
    return (T) this;
}
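
A hedged usage sketch (not from the original file) showing how weightInitRecurrent is typically combined with weightInit on a recurrent layer builder; the layer type and sizes are illustrative assumptions.

// Illustrative only: nIn/nOut values are assumptions. Any WeightInit other than
// DISTRIBUTION is accepted here; WeightInit.DISTRIBUTION would throw the exception above.
LSTM recurrentLayer = new LSTM.Builder()
        .nIn(64)
        .nOut(32)
        .weightInit(WeightInit.XAVIER)          // initialization for the input weights
        .weightInitRecurrent(WeightInit.RELU)   // separate initialization for the recurrent weights
        .build();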
 
Example 6
Source File: TransferLearning.java    From deeplearning4j with Apache License 2.0
/**
 * Modify the architecture of a layer by changing nOut
 * Note this will also affect the layer that follows the layer specified, unless it is the output layer
 * Can specify different weight init schemes for the specified layer and the layer that follows it.
 *
 * @param layerNum   The index of the layer to change nOut of
 * @param nOut       Value of nOut to change to
 * @param scheme     Weight init scheme to use for params in layerNum
 * @param schemeNext Weight init scheme to use for params in layerNum+1
 * @return Builder
 */
public Builder nOutReplace(int layerNum, int nOut, WeightInit scheme, WeightInit schemeNext) {
    if(scheme == WeightInit.DISTRIBUTION || schemeNext == WeightInit.DISTRIBUTION) {
        throw new UnsupportedOperationException("Not supported!, Use " +
                "nOutReplace(layerNum, nOut, new WeightInitDistribution(dist), new WeightInitDistribution(distNext)) instead!");
    }
    return nOutReplace(layerNum, nOut, scheme.getWeightInitFunction(), schemeNext.getWeightInitFunction());
}
 
Example 7
Source File: TransferLearning.java    From deeplearning4j with Apache License 2.0
/**
 * Modify the architecture of a layer by changing nOut
 * Note this will also affect the layer that follows the layer specified, unless it is the output layer
 * Can specify different weight init schemes for the specified layer and the layer that follows it.
 *
 * @param layerNum The index of the layer to change nOut of
 * @param nOut     Value of nOut to change to
 * @param scheme   Weight init scheme to use for params in layerNum
 * @param distNext Distribution to use for params in layerNum+1
 * @return Builder
 * @see org.deeplearning4j.nn.weights.WeightInitDistribution
 */
public Builder nOutReplace(int layerNum, int nOut, WeightInit scheme, Distribution distNext) {
    if(scheme == WeightInit.DISTRIBUTION) {
        throw new UnsupportedOperationException("Not supported!, Use " +
                "nOutReplace(int layerNum, int nOut, Distribution dist, Distribution distNext) instead!");
    }
    return nOutReplace(layerNum, nOut, scheme.getWeightInitFunction(), new WeightInitDistribution(distNext));
}
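
A hedged usage sketch of this Builder overload (not from the original file); the layer index, nOut value, and distribution parameters are illustrative assumptions.

// Illustrative only: layer index 2, nOut 128 and NormalDistribution(0, 0.1) are assumptions.
// The changed layer is re-initialised with XAVIER and the following layer from the distribution.
MultiLayerNetwork newModel = new TransferLearning.Builder(pretrainedModel)
        .nOutReplace(2, 128, WeightInit.XAVIER, new NormalDistribution(0, 0.1))
        .build();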