Java Code Examples for org.nd4j.linalg.api.ndarray.INDArray#isVector()

The following examples show how to use org.nd4j.linalg.api.ndarray.INDArray#isVector(). They are taken from the open source projects named above each snippet; see the original source files for full context and related API usage.
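Before the examples, here is a short illustrative sketch (not from any of the projects below) of what isVector() reports for a few common shapes. Note that whether Nd4j.create(double[]) yields a rank-1 array or a [1, n] row can differ between ND4J versions, but isVector() is true in both cases.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class IsVectorDemo {
    public static void main(String[] args) {
        INDArray rowLike = Nd4j.create(new double[] {1, 2, 3}); // shape [3] or [1, 3], depending on version
        INDArray column = Nd4j.zeros(3, 1);                     // shape [3, 1]
        INDArray matrix = Nd4j.zeros(3, 4);                     // shape [3, 4]

        System.out.println(rowLike.isVector()); // true
        System.out.println(column.isVector());  // true: one dimension has size 1
        System.out.println(matrix.isVector());  // false: a genuine 2d matrix
    }
}
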
Example 1
Source File: RegressionMetrics.java    From konduit-serving with Apache License 2.0
private void handleNdArray(INDArray output) {
    if(output.isVector()) {
        for(int i = 0; i < output.length(); i++) {
            statCounters.get(i).add(output.getDouble(i));
        }
    }
    else if(output.isMatrix() && output.length() > 1) {
        for(int i = 0; i < output.rows(); i++) {
            for(int j = 0; j < output.columns(); j++) {
                statCounters.get(i).add(output.getDouble(i,j));
            }
        }
    }
    else if(output.isScalar()) {
        statCounters.get(0).add(output.sumNumber().doubleValue());
    }
    else {
        throw new IllegalArgumentException("Only vectors and matrices supported right now");
    }
}
 
Example 2
Source File: NearestNeighbor.java    From deeplearning4j with Apache License 2.0
public List<NearestNeighborsResult> search() {
    INDArray input = points.slice(record.getInputIndex());
    List<NearestNeighborsResult> results = new ArrayList<>();
    if (input.isVector()) {
        List<DataPoint> add = new ArrayList<>();
        List<Double> distances = new ArrayList<>();
        tree.search(input, record.getK(), add, distances);

        if (add.size() != distances.size()) {
            throw new IllegalStateException(
                    String.format("add.size == %d != %d == distances.size",
                            add.size(), distances.size()));
        }

        for (int i=0; i<add.size(); i++) {
            results.add(new NearestNeighborsResult(add.get(i).getIndex(), distances.get(i)));
        }
    }

    return results;
}
 
Example 3
Source File: OpExecutionerUtil.java    From nd4j with Apache License 2.0
/**Choose tensor dimension for operations with 3 arguments: z=Op(x,y) or similar<br>
 * @see #chooseElementWiseTensorDimension(INDArray)
 */
public static int chooseElementWiseTensorDimension(INDArray x, INDArray y, INDArray z) {
    if (x.isVector())
        return ArrayUtil.argMax(x.shape());

    // FIXME: int cast

    int opAlongDimensionMinStride = (int) ArrayUtil.argMinOfMax(x.stride(), y.stride(), z.stride());

    int opAlongDimensionMaxLength = ArrayUtil.argMax(x.shape());
    //Edge case: shapes with 1s in them can have stride of 1 on the dimensions of length 1
    if (opAlongDimensionMinStride == opAlongDimensionMaxLength || x.size((int) opAlongDimensionMinStride) == 1)
        return opAlongDimensionMaxLength;

    int nOpsAlongMinStride = ArrayUtil.prod(ArrayUtil.removeIndex(x.shape(), (int) opAlongDimensionMinStride));
    int nOpsAlongMaxLength = ArrayUtil.prod(ArrayUtil.removeIndex(x.shape(), opAlongDimensionMaxLength));
    if (nOpsAlongMinStride <= 10 * nOpsAlongMaxLength)
        return opAlongDimensionMinStride;
    else
        return opAlongDimensionMaxLength;
}
 
Example 4
Source File: Shape.java    From nd4j with Apache License 2.0
/** Check if strides are in order suitable for non-strided mmul etc.
 * Returns true if c order and strides are descending [100,10,1] etc
 * Returns true if f order and strides are ascending [1,10,100] etc
 * False otherwise.
 * @return true if c+descending, f+ascending, false otherwise
 */
public static boolean strideDescendingCAscendingF(INDArray array) {
    long[] strides = array.stride();
    if (array.isVector() && strides[0] == 1 && strides[1] == 1)
        return true;
    char order = array.ordering();

    if (order == 'c') { //Expect descending. [100,10,1] etc
        for (int i = 1; i < strides.length; i++)
            if (strides[i - 1] <= strides[i])
                return false;
        return true;
    } else if (order == 'f') {//Expect ascending. [1,10,100] etc
        for (int i = 1; i < strides.length; i++)
            if (strides[i - 1] >= strides[i])
                return false;
        return true;
    } else if (order == 'a') {
        return true;
    } else {
        throw new RuntimeException("Invalid order: not c or f (is: " + order + ")");
    }
}
 
Example 5
Source File: Shape.java    From deeplearning4j with Apache License 2.0
/** Check if strides are in order suitable for non-strided mmul etc.
 * Returns true if c order and strides are descending [100,10,1] etc
 * Returns true if f order and strides are ascending [1,10,100] etc
 * False otherwise.
 * @return true if c+descending, f+ascending, false otherwise
 */
public static boolean strideDescendingCAscendingF(INDArray array) {
    if(array.rank() <= 1)
        return true;
    long[] strides = array.stride();
    if (array.isVector() && strides[0] == 1 && strides[1] == 1)
        return true;
    char order = array.ordering();

    if (order == 'c') { //Expect descending. [100,10,1] etc
        for (int i = 1; i < strides.length; i++)
            if (strides[i - 1] <= strides[i])
                return false;
        return true;
    } else if (order == 'f') {//Expect ascending. [1,10,100] etc
        for (int i = 1; i < strides.length; i++)
            if (strides[i - 1] >= strides[i])
                return false;
        return true;
    } else if (order == 'a') {
        return true;
    } else {
        throw new RuntimeException("Invalid order: not c or f (is: " + order + ")");
    }
}
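
A small illustrative sketch (not from either project) for the check above, assuming the Shape class shown here lives at org.nd4j.linalg.api.shape.Shape: freshly created c-order and f-order matrices both satisfy the condition, which is what makes non-strided BLAS-style calls possible on them.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.shape.Shape;
import org.nd4j.linalg.factory.Nd4j;

public class StrideOrderDemo {
    public static void main(String[] args) {
        // c-order matrix: strides are descending, e.g. [4, 1] for shape [3, 4]
        INDArray c = Nd4j.create(new int[] {3, 4}, 'c');
        System.out.println(Shape.strideDescendingCAscendingF(c)); // true

        // f-order matrix: strides are ascending, e.g. [1, 3] for shape [3, 4]
        INDArray f = Nd4j.create(new int[] {3, 4}, 'f');
        System.out.println(Shape.strideDescendingCAscendingF(f)); // true

        // Views obtained by permuting or slicing may break this property,
        // which is exactly what the check above is meant to detect.
    }
}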
 
Example 6
Source File: SpTree.java    From deeplearning4j with Apache License 2.0
/**
 * Compute edge forces using Barnes-Hut
 * @param rowP a vector of row pointers for the sparse input matrix
 * @param colP a vector of column indices for the sparse input matrix
 * @param valP a vector of values for the sparse input matrix
 * @param N the number of elements
 * @param posF the positive force
 */
public void computeEdgeForces(INDArray rowP, INDArray colP, INDArray valP, int N, INDArray posF) {
    if (!rowP.isVector())
        throw new IllegalArgumentException("RowP must be a vector");

    // Loop over all edges in the graph
    // just execute native op
    Nd4j.exec(new BarnesEdgeForces(rowP, colP, valP, data, N, posF));

    /*
    INDArray buf = Nd4j.create(data.dataType(), this.D);
    double D;
    for (int n = 0; n < N; n++) {
        INDArray slice = data.slice(n);
        for (int i = rowP.getInt(n); i < rowP.getInt(n + 1); i++) {

            // Compute pairwise distance and Q-value
            slice.subi(data.slice(colP.getInt(i)), buf);

            D = 1.0 + Nd4j.getBlasWrapper().dot(buf, buf);
            D = valP.getDouble(i) / D;

            // Sum positive force
            posF.slice(n).addi(buf.muli(D));
        }
    }
    */
}
 
Example 7
Source File: TimeSeriesUtils.java    From deeplearning4j with Apache License 2.0
/**
 * Reshape a time series mask vector to a 2d mask array. This should match the assumptions (f order, etc) in RnnOutputLayer
 * @param timeSeriesMaskAsVector    Mask array as a vector, of length minibatchSize * timeSeriesLength
 * @param minibatchSize             Minibatch size
 * @return                          Mask array reshaped to [minibatchSize, timeSeriesLength]
 */
public static INDArray reshapeVectorToTimeSeriesMask(INDArray timeSeriesMaskAsVector, int minibatchSize) {
    if (!timeSeriesMaskAsVector.isVector())
        throw new IllegalArgumentException("Cannot reshape mask: expected vector");

    val timeSeriesLength = timeSeriesMaskAsVector.length() / minibatchSize;

    return timeSeriesMaskAsVector.reshape('f', minibatchSize, timeSeriesLength);
}
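
A hypothetical usage sketch for the helper above, assuming TimeSeriesUtils is available at org.deeplearning4j.util.TimeSeriesUtils (as in the deeplearning4j sources): a flat mask of length minibatch * timeSeriesLength is reshaped back into [minibatch, timeSeriesLength].

import java.util.Arrays;

import org.deeplearning4j.util.TimeSeriesUtils;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class MaskReshapeDemo {
    public static void main(String[] args) {
        int minibatch = 4;
        int timeSeriesLength = 5;

        // Flat mask vector of length minibatch * timeSeriesLength (all 1s here)
        INDArray flatMask = Nd4j.ones(minibatch * timeSeriesLength);

        INDArray mask = TimeSeriesUtils.reshapeVectorToTimeSeriesMask(flatMask, minibatch);
        System.out.println(Arrays.toString(mask.shape())); // [4, 5]
    }
}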
 
Example 8
Source File: FeedForwardToRnnPreProcessor.java    From deeplearning4j with Apache License 2.0
@Override
public Pair<INDArray, MaskState> feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState,
                int minibatchSize) {
    //Assume mask array is 1d - a mask array that has been reshaped from [minibatch,timeSeriesLength] to [minibatch*timeSeriesLength, 1]
    if (maskArray == null) {
        return new Pair<>(maskArray, currentMaskState);
    } else if (maskArray.isVector()) {
        //Need to reshape mask array from [minibatch*timeSeriesLength, 1] to [minibatch,timeSeriesLength]
        return new Pair<>(TimeSeriesUtils.reshapeVectorToTimeSeriesMask(maskArray, minibatchSize),
                        currentMaskState);
    } else {
        throw new IllegalArgumentException("Received mask array with shape " + Arrays.toString(maskArray.shape())
                        + "; expected vector.");
    }
}
 
Example 9
Source File: BaseNDArrayFactory.java    From deeplearning4j with Apache License 2.0
/**
 * Reverses the passed in matrix such that m[0] becomes m[m.length - 1] etc
 *
 * @param reverse the matrix to reverse
 * @return the reversed matrix
 */
@Override
public INDArray rot(INDArray reverse) {
    INDArray ret = Nd4j.create(reverse.shape());
    if (reverse.isVector())
        return reverse(reverse);
    else {
        for (int i = 0; i < reverse.slices(); i++) {
            ret.putSlice(i, reverse(reverse.slice(i)));
        }
    }
    return ret.reshape(reverse.shape());
}
 
Example 10
Source File: BaseNDArrayFactory.java    From nd4j with Apache License 2.0
/**
 * Reverses the passed in matrix such that m[0] becomes m[m.length - 1] etc
 *
 * @param reverse the matrix to reverse
 * @return the reversed matrix
 */
@Override
public INDArray rot(INDArray reverse) {
    INDArray ret = Nd4j.create(reverse.shape());
    if (reverse.isVector())
        return reverse(reverse);
    else {
        for (int i = 0; i < reverse.slices(); i++) {
            ret.putSlice(i, reverse(reverse.slice(i)));
        }
    }
    return ret.reshape(reverse.shape());
}
 
Example 11
Source File: OpExecutionerUtil.java    From nd4j with Apache License 2.0
/**
 *
 * Choose tensor dimension for operations with one argument: x=Op(x) or similar<br>
 * When doing some operations in parallel, it is necessary to break up
 * operations along a dimension to give a set of 1d tensors.
 * The dimension this is done on matters for performance: in summary, we want to
 * minimize the number of tensors while also minimizing the separation between
 * elements in the buffer (so the resulting operation is efficient - i.e., avoids cache thrashing).
 * However, achieving both a minimal number of tensors and minimal element
 * separation is not always possible.
 * @param x NDArray that we want to split
 * @return The best dimension to split on
 */
public static int chooseElementWiseTensorDimension(INDArray x) {
    if (x.isVector())
        return ArrayUtil.argMax(x.shape()); //Execute along the vector

    //doing argMin(max(x.stride(i),y.stride(i))) minimizes the maximum
    //separation between elements (helps CPU cache) BUT might result in a huge number
    //of tiny ops - i.e., addi on NDArrays with shape [5,10^6]
    int opAlongDimensionMinStride = ArrayUtil.argMin(x.stride());

    //doing argMax on shape gives us smallest number of largest tensors
    //but may not be optimal in terms of element separation (for CPU cache etc)
    int opAlongDimensionMaxLength = ArrayUtil.argMax(x.shape());

    //Edge cases: shapes with 1s in them can have stride of 1 on the dimensions of length 1
    if (x.isVector() || x.size(opAlongDimensionMinStride) == 1)
        return opAlongDimensionMaxLength;

    //Using a heuristic approach here: basically if we get >= 10x as many tensors using the minimum stride
    //dimension vs. the maximum size dimension, use the maximum size dimension instead
    //The idea is to avoid choosing wrong dimension in cases like shape=[10,10^6]
    //Might be able to do better than this with some additional thought
    int nOpsAlongMinStride = ArrayUtil.prod(ArrayUtil.removeIndex(x.shape(), opAlongDimensionMinStride));
    int nOpsAlongMaxLength = ArrayUtil.prod(ArrayUtil.removeIndex(x.shape(), opAlongDimensionMaxLength));
    if (nOpsAlongMinStride <= 10 * nOpsAlongMaxLength)
        return opAlongDimensionMinStride;
    else
        return opAlongDimensionMaxLength;
}
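
To make the heuristic in the Javadoc concrete, here is a small standalone sketch in plain Java (no nd4j calls, values chosen purely for illustration) that works through the ">= 10x as many tensors" rule for a hypothetical f-order array of shape [10, 1000000] with strides [1, 10].

public class TensorDimensionHeuristicDemo {
    public static void main(String[] args) {
        // Hypothetical f-order array: shape [10, 1_000_000], strides [1, 10]
        long[] shape = {10, 1_000_000};

        int minStrideDim = 0;   // dimension 0 has the smallest stride (1)
        int maxLengthDim = 1;   // dimension 1 is the longest (1,000,000)

        // Splitting along the min-stride dimension yields 1,000,000 tensors of length 10;
        // splitting along the max-length dimension yields only 10 tensors of length 1,000,000.
        long nOpsAlongMinStride = shape[1]; // 1_000_000
        long nOpsAlongMaxLength = shape[0]; // 10

        // The heuristic: prefer the min-stride dimension unless it produces
        // >= 10x as many (tiny) tensors as the max-length dimension.
        int chosen = (nOpsAlongMinStride <= 10 * nOpsAlongMaxLength) ? minStrideDim : maxLengthDim;
        System.out.println("Chosen dimension: " + chosen); // 1 -> a few large tensors win here
    }
}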
 
Example 12
Source File: OpExecutionerUtil.java    From nd4j with Apache License 2.0
/** Can we do the transform op (Z = Op(X,Y)) directly on the arrays without breaking them up into 1d tensors first? */
public static boolean canDoOpDirectly(INDArray x, INDArray y, INDArray z) {
    if (x.isVector())
        return true;
    if (x.ordering() != y.ordering() || x.ordering() != z.ordering())
        return false; //other than vectors, elements in f vs. c NDArrays will never line up
    if (x.elementWiseStride() < 1 || y.elementWiseStride() < 1)
        return false;
    //Full buffer + matching strides -> implies all elements are contiguous (and match)
    long l1 = x.lengthLong();
    long dl1 = x.data().length();
    long l2 = y.lengthLong();
    long dl2 = y.data().length();
    long l3 = z.lengthLong();
    long dl3 = z.data().length();
    long[] strides1 = x.stride();
    long[] strides2 = y.stride();
    long[] strides3 = z.stride();
    boolean equalStrides = Arrays.equals(strides1, strides2) && Arrays.equals(strides1, strides3);
    if (l1 == dl1 && l2 == dl2 && l3 == dl3 && equalStrides)
        return true;

    //Strides match + are same as a zero offset NDArray -> all elements are contiguous (and match)
    if (equalStrides) {
        long[] shape1 = x.shape();
        long[] stridesAsInit = (x.ordering() == 'c' ? ArrayUtil.calcStrides(shape1)
                        : ArrayUtil.calcStridesFortran(shape1));
        boolean stridesSameAsInit = Arrays.equals(strides1, stridesAsInit);
        return stridesSameAsInit;
    }

    return false;
}
 
Example 13
Source File: MLLibUtil.java    From deeplearning4j with Apache License 2.0
/**
 * Convert an ndarray to a vector
 * @param arr the array
 * @return an mllib vector
 */
public static Vector toVector(INDArray arr) {
    if (!arr.isVector()) {
        throw new IllegalArgumentException("passed in array must be a vector");
    }
    if (arr.length() > Integer.MAX_VALUE)
        throw new ND4JArraySizeException();
    double[] ret = new double[(int) arr.length()];
    for (int i = 0; i < arr.length(); i++) {
        ret[i] = arr.getDouble(i);
    }

    return Vectors.dense(ret);
}
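
A minimal usage sketch for toVector, assuming the deeplearning4j Spark module is on the classpath and MLLibUtil lives at org.deeplearning4j.spark.util.MLLibUtil (the package names here are assumptions based on the source project).

import org.apache.spark.mllib.linalg.Vector;
import org.deeplearning4j.spark.util.MLLibUtil;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class ToVectorDemo {
    public static void main(String[] args) {
        INDArray row = Nd4j.create(new double[] {0.1, 0.2, 0.3});
        Vector v = MLLibUtil.toVector(row); // isVector() is true, so the conversion succeeds
        System.out.println(v);

        // Passing a genuine matrix (e.g. Nd4j.zeros(2, 3)) would throw
        // an IllegalArgumentException, as enforced by the check above.
    }
}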
 
Example 14
Source File: LinAlgExceptions.java    From nd4j with Apache License 2.0
public static void assertVector(INDArray arr) {
    if (!arr.isVector())
        throw new IllegalArgumentException("Array must be a vector. Array has shape: " + Arrays.toString(arr.shape()));
}
 
Example 15
Source File: LinAlgExceptions.java    From deeplearning4j with Apache License 2.0
public static void assertVector(INDArray arr) {
    if (!arr.isVector())
        throw new IllegalArgumentException("Array must be a vector. Array has shape: " + Arrays.toString(arr.shape()));
}
 
Example 16
Source File: NDArrayIndex.java    From nd4j with Apache License 2.0
/**
 * Create indices from a matrix or vector. Each row of a matrix becomes one
 * NDArrayIndex; the columns are the individual elements of that index.
 *
 * @param index the matrix (or vector) to read indices from
 * @return the resulting indices
 */
public static INDArrayIndex[] create(INDArray index) {

    if (index.isMatrix()) {

        if (index.rows() > Integer.MAX_VALUE)
            throw new ND4JArraySizeException();

        NDArrayIndex[] ret = new NDArrayIndex[(int) index.rows()];
        for (int i = 0; i < index.rows(); i++) {
            INDArray row = index.getRow(i);
            val nums = new long[(int) index.getRow(i).columns()];
            for (int j = 0; j < row.columns(); j++) {
                nums[j] = (int) row.getFloat(j);
            }

            NDArrayIndex idx = new NDArrayIndex(nums);
            ret[i] = idx;
        }

        return ret;
    } else if (index.isVector()) {
        long[] indices = NDArrayUtil.toLongs(index);
        return new NDArrayIndex[] {new NDArrayIndex(indices)};
    }

    throw new IllegalArgumentException("Passed in ndarray must be a matrix or a vector");
}
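
The sketch below is a hypothetical usage of the factory above; it assumes NDArrayIndex.create(INDArray) is available as shown (package org.nd4j.linalg.indexing) and only inspects how many indices are produced.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.indexing.INDArrayIndex;
import org.nd4j.linalg.indexing.NDArrayIndex;

public class IndexFactoryDemo {
    public static void main(String[] args) {
        // A vector of indices produces a single INDArrayIndex...
        INDArray vectorIndex = Nd4j.create(new double[] {0, 2, 3});
        INDArrayIndex[] fromVector = NDArrayIndex.create(vectorIndex);
        System.out.println(fromVector.length); // 1

        // ...while a matrix produces one INDArrayIndex per row
        INDArray matrixIndex = Nd4j.create(new double[][] {{0, 1}, {2, 3}});
        INDArrayIndex[] fromMatrix = NDArrayIndex.create(matrixIndex);
        System.out.println(fromMatrix.length); // 2
    }
}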
 
Example 17
Source File: BlasBufferUtil.java    From nd4j with Apache License 2.0
/**
 * Get the leading dimension for a blas invocation.
 *
 * The leading dimension is arr.size(0) for fortran (f) ordering
 * and arr.size(1) (assuming a matrix) for c ordering.
 * @param arr the array to get the leading dimension for
 * @return the leading dimension wrt the ordering of the array
 */
public static int getLd(INDArray arr) {
    //ignore ordering for vectors
    if (arr.isVector()) {
        return (int) arr.length();
    }

    return arr.ordering() == NDArrayFactory.C ? (int) arr.size(1) : (int) arr.size(0);
}
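
A hedged illustration of getLd (assuming BlasBufferUtil lives at org.nd4j.linalg.api.blas.BlasBufferUtil, as in the nd4j sources): vectors return their full length regardless of ordering, while matrices return size(1) for c ordering and size(0) for f ordering.

import org.nd4j.linalg.api.blas.BlasBufferUtil;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class LeadingDimensionDemo {
    public static void main(String[] args) {
        INDArray vector = Nd4j.create(new double[] {1, 2, 3, 4, 5});
        System.out.println(BlasBufferUtil.getLd(vector)); // 5: full length, ordering ignored

        INDArray cMatrix = Nd4j.create(new int[] {3, 4}, 'c');
        System.out.println(BlasBufferUtil.getLd(cMatrix)); // 4: size(1) for c ordering

        INDArray fMatrix = Nd4j.create(new int[] {3, 4}, 'f');
        System.out.println(BlasBufferUtil.getLd(fMatrix)); // 3: size(0) for f ordering
    }
}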
 
Example 18
Source File: BlasBufferUtil.java    From nd4j with Apache License 2.0
/**
 * Get the dimension associated with the given ordering.
 *
 * Blas routines typically assume c ordering; by swapping which
 * dimension is treated as rows vs. columns based on the array's
 * ordering, you can do no-copy blas operations.
 *
 * @param arr the array to get the dimension for
 * @param defaultRows whether to prefer the row count (true) or the column count (false)
 * @return the row or column count, adjusted for the array's ordering
 */
public static int getDimension(INDArray arr, boolean defaultRows) {
    // FIXME: int cast

    //ignore ordering for vectors
    if (arr.isVector()) {
        return defaultRows ? (int) arr.rows() : (int) arr.columns();
    }
    if (arr.ordering() == NDArrayFactory.C)
        return defaultRows ? (int) arr.columns() : (int) arr.rows();
    return defaultRows ? (int) arr.rows() : (int) arr.columns();
}
 
Example 19
Source File: BlasBufferUtil.java    From deeplearning4j with Apache License 2.0
/**
 * Get the dimension associated with the given ordering.
 *
 * Blas routines typically assume c ordering; by swapping which
 * dimension is treated as rows vs. columns based on the array's
 * ordering, you can do no-copy blas operations.
 *
 * @param arr the array to get the dimension for
 * @param defaultRows whether to prefer the row count (true) or the column count (false)
 * @return the row or column count, adjusted for the array's ordering
 */
public static long getDimension(INDArray arr, boolean defaultRows) {

    //ignore ordering for vectors
    if (arr.isVector()) {
        return defaultRows ? arr.rows() : arr.columns();
    }
    if (arr.ordering() == NDArrayFactory.C)
        return defaultRows ? arr.columns() : arr.rows();
    return defaultRows ? arr.rows() : arr.columns();
}
 
Example 20
Source File: BlasBufferUtil.java    From deeplearning4j with Apache License 2.0
/**
 * Get the leading dimension for a blas invocation.
 *
 * The leading dimension is arr.size(0) for fortran (f) ordering
 * and arr.size(1) (assuming a matrix) for c ordering.
 * @param arr the array to get the leading dimension for
 * @return the leading dimension wrt the ordering of the array
 */
public static int getLd(INDArray arr) {
    //ignore ordering for vectors
    if (arr.isVector()) {
        return (int) arr.length();
    }

    return arr.ordering() == NDArrayFactory.C ? (int) arr.size(1) : (int) arr.size(0);
}