Java Code Examples for org.apache.arrow.vector.FieldVector#getTransferPair()

The following examples show how to use org.apache.arrow.vector.FieldVector#getTransferPair(). Each example is taken from an open-source project; the source file and license are noted above the code.
Example 1
Source File: VectorContainer.java    From dremio-oss with Apache License 2.0
public static void transferFromRoot(VectorSchemaRoot root, VectorContainer container, BufferAllocator allocator) {
  container.clear();
  // iterate over and transfer columns
  for (FieldVector fv : root.getFieldVectors()) {
    final TransferPair tp = fv.getTransferPair(allocator);
    tp.transfer();
    container.add(tp.getTo());
  }

  container.setRecordCount(root.getRowCount());
  container.addSchema(root.getSchema());
  container.buildSchema();
}
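The transfer-without-copy behavior used above can be demonstrated in isolation. The following is a minimal sketch (the class and variable names are illustrative, not from dremio-oss): a populated IntVector is handed off via getTransferPair(), after which the target vector owns the buffers and the source vector is left empty.

```java
import org.apache.arrow.memory.BufferAllocator;
import org.apache.arrow.memory.RootAllocator;
import org.apache.arrow.vector.IntVector;
import org.apache.arrow.vector.util.TransferPair;

public class TransferPairSketch {
  public static void main(String[] args) {
    try (BufferAllocator allocator = new RootAllocator(Long.MAX_VALUE);
         BufferAllocator targetAllocator =
             allocator.newChildAllocator("target", 0, Long.MAX_VALUE)) {
      IntVector source = new IntVector("ints", allocator);
      source.allocateNew(3);
      source.set(0, 10);
      source.set(1, 20);
      source.set(2, 30);
      source.setValueCount(3);

      // getTransferPair creates an empty target vector bound to the given allocator
      TransferPair tp = source.getTransferPair(targetAllocator);
      tp.transfer(); // moves buffer ownership; no data is copied
      IntVector target = (IntVector) tp.getTo();

      System.out.println(target.getValueCount()); // 3
      System.out.println(target.get(1));          // 20
      System.out.println(source.getValueCount()); // 0: the source is cleared by transfer()

      target.close();
      source.close();
    }
  }
}
```

Note that after transfer() the source vector no longer holds any buffers, which is why the examples above can safely call root.clear() once the batch has been handed off.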
 
Example 2
Source File: ArrowResultChunk.java    From snowflake-jdbc with Apache License 2.0
/**
 * Reads an InputStream of Arrow data bytes and loads them into Java value
 * vectors.
 * Note that no data is copied once it has been loaded into memory:
 * ArrowStreamReader originally allocates the memory that holds the vectors,
 * but ownership of that memory is transferred to this ArrowResultChunk, so
 * the memory is not released when the ArrowStreamReader is garbage collected.
 *
 * @param is an InputStream containing an Arrow data file in bytes
 * @throws IOException if the data cannot be read as an Arrow file
 */
public void readArrowStream(InputStream is)
throws IOException
{
  ArrayList<ValueVector> valueVectors = new ArrayList<>();
  try (ArrowStreamReader reader = new ArrowStreamReader(is, rootAllocator))
  {
    root = reader.getVectorSchemaRoot();
    while (reader.loadNextBatch())
    {
      valueVectors = new ArrayList<>();

      for (FieldVector f : root.getFieldVectors())
      {
        // transfer will not copy data but transfer ownership of memory
        // from streamReader to resultChunk
        TransferPair t = f.getTransferPair(rootAllocator);
        t.transfer();
        valueVectors.add(t.getTo());
      }

      addBatchData(valueVectors);
      root.clear();
    }
  }
  catch (ClosedByInterruptException cbie)
  {
    // happens when the statement is closed before finish parsing
    logger.debug("Interrupted when loading Arrow result", cbie);
    valueVectors.forEach(ValueVector::close);
    freeData();
  }
  catch (Exception ex)
  {
    valueVectors.forEach(ValueVector::close);
    freeData();
    throw ex;
  }
}
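The Javadoc's point about memory ownership can be checked through allocator accounting. In this sketch (allocator names and structure are my own, not from snowflake-jdbc), transferring between two child allocators moves the allocated bytes from the source allocator's ledger to the target's, without copying:

```java
import org.apache.arrow.memory.BufferAllocator;
import org.apache.arrow.memory.RootAllocator;
import org.apache.arrow.vector.IntVector;
import org.apache.arrow.vector.util.TransferPair;

public class OwnershipSketch {
  public static void main(String[] args) {
    try (BufferAllocator root = new RootAllocator(Long.MAX_VALUE);
         BufferAllocator readerSide = root.newChildAllocator("reader", 0, Long.MAX_VALUE);
         BufferAllocator chunkSide = root.newChildAllocator("chunk", 0, Long.MAX_VALUE)) {
      IntVector v = new IntVector("v", readerSide);
      v.allocateNew(4);
      v.setValueCount(4);
      System.out.println(readerSide.getAllocatedMemory() > 0);  // reader owns the buffers

      TransferPair tp = v.getTransferPair(chunkSide);
      tp.transfer(); // ownership moves to chunkSide; no bytes are copied
      System.out.println(readerSide.getAllocatedMemory() == 0); // reader's ledger is empty
      System.out.println(chunkSide.getAllocatedMemory() > 0);   // chunk now owns the buffers

      tp.getTo().close();
      v.close();
    }
  }
}
```

This is why readArrowStream() can let the ArrowStreamReader (and its allocations) go out of scope while the vectors stored via addBatchData() remain valid: the chunk's allocator, not the reader's, owns the memory after transfer().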