org.apache.spark.sql.sources.v2.reader.DataReader Java Examples

The following examples show how to use org.apache.spark.sql.sources.v2.reader.DataReader. Each example notes the source file it was taken from, the open-source project it belongs to, and that project's license.
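For orientation before the project-specific snippets, here is a minimal, self-contained sketch of a DataReaderFactory/DataReader pair against the Spark 2.3 DataSourceV2 API. The class name RangeDataReaderFactory and the single integer column are illustrative only and do not come from any of the projects below.

import java.io.IOException;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.sources.v2.reader.DataReader;
import org.apache.spark.sql.sources.v2.reader.DataReaderFactory;

// Illustrative factory: emits the integers [start, end) as single-column rows.
public class RangeDataReaderFactory implements DataReaderFactory<Row> {
    private final int start;
    private final int end;

    public RangeDataReaderFactory(int start, int end) {
        this.start = start;
        this.end = end;
    }

    @Override
    public DataReader<Row> createDataReader() {
        return new DataReader<Row>() {
            private int current = start - 1;

            @Override
            public boolean next() throws IOException {
                current++;                      // advance the cursor
                return current < end;           // report whether a row is available
            }

            @Override
            public Row get() {
                return RowFactory.create(current);  // materialize the current row
            }

            @Override
            public void close() throws IOException {
                // nothing to release in this in-memory example
            }
        };
    }
}

The factories in the examples that follow have the same shape: createDataReader() is invoked on the executor, so any state the reader needs (hosts, splits, serialized configuration) must be carried by the factory itself.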
Example #1
Source File: HiveWarehouseDataReaderFactory.java    From spark-llap with Apache License 2.0
@Override
public DataReader<ColumnarBatch> createDataReader() {
    LlapInputSplit llapInputSplit = new LlapInputSplit();
    ByteArrayInputStream splitByteArrayStream = new ByteArrayInputStream(splitBytes);
    ByteArrayInputStream confByteArrayStream = new ByteArrayInputStream(confBytes);
    JobConf conf = new JobConf();

    // Rehydrate the LLAP input split and job configuration from the serialized
    // bytes carried by this factory, then hand them to the reader.
    try (DataInputStream splitByteData = new DataInputStream(splitByteArrayStream);
         DataInputStream confByteData = new DataInputStream(confByteArrayStream)) {
        llapInputSplit.readFields(splitByteData);
        conf.readFields(confByteData);
        return getDataReader(llapInputSplit, conf, arrowAllocatorMax);
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
 
Example #2
Source File: SimpleRowDataSource.java    From spark-data-sources with MIT License
@Override
public DataReader<Row> createDataReader() {
    log.info("Factory creating reader for [" + _host + ":" + _port + "]" );
    try {
        return new TaskDataReader(_host, _port);
    } catch (UnknownTableException ute) {
        throw new RuntimeException(ute);
    }
}
 
Example #3
Source File: ParallelRowReadWriteDataSource.java    From spark-data-sources with MIT License
@Override
public DataReader<Row> createDataReader() {
    log.info("Factory creating reader for [" + _host + ":" + _port + "]" );
    try {
        return new TaskDataReader(_host, _port, _table, _schema, _split);
    } catch (UnknownTableException ute) {
        throw new RuntimeException(ute);
    }
}
 
Example #4
Source File: FlexibleRowDataSource.java    From spark-data-sources with MIT License
@Override
public DataReader<Row> createDataReader() {
    log.info("Factory creating reader for [" + _host + ":" + _port + "]" );
    try {
        return new TaskDataReader(_host, _port, _table, _schema);
    } catch (UnknownTableException ute) {
        throw new RuntimeException(ute);
    }
}
 
Example #5
Source File: ParallelRowDataSource.java    From spark-data-sources with MIT License
@Override
public DataReader<Row> createDataReader() {
    log.info("Factory creating reader for [" + _host + ":" + _port + "]" );
    try {
        return new TaskDataReader(_host, _port, _table, _schema, _split);
    } catch (UnknownTableException ute) {
        throw new RuntimeException(ute);
    }
}
 
Example #6
Source File: PartitioningRowDataSource.java    From spark-data-sources with MIT License
@Override
public DataReader<Row> createDataReader() {
    log.info("Factory creating reader for [" + _host + ":" + _port + "]" );
    try {
        return new TaskDataReader(_host, _port, _table, _schema, _split);
    } catch (UnknownTableException ute) {
        throw new RuntimeException(ute);
    }
}
 
Example #7
Source File: MockHiveWarehouseConnector.java    From spark-llap with Apache License 2.0
@Override
public DataReader<ColumnarBatch> createDataReader() {
  try {
    return getDataReader(null, new JobConf(), Long.MAX_VALUE);
  } catch (Exception e) {
    throw new RuntimeException(e);
  }
}
 
Example #8
Source File: Reader.java    From iceberg with Apache License 2.0
@Override
public DataReader<UnsafeRow> createDataReader() {
  return new TaskDataReader(task, lazyTableSchema(), lazyExpectedSchema(), conf.value());
}
 
Example #9
Source File: CountDataReaderFactory.java    From spark-llap with Apache License 2.0
@Override
public DataReader<ColumnarBatch> createDataReader() {
  return new CountDataReader(numRows);
}
 
Example #10
Source File: HiveWarehouseDataReaderFactory.java    From spark-llap with Apache License 2.0
protected DataReader<ColumnarBatch> getDataReader(LlapInputSplit split, JobConf jobConf, long arrowAllocatorMax)
    throws Exception {
    return new HiveWarehouseDataReader(split, jobConf, arrowAllocatorMax);
}
 
Example #11
Source File: SimpleMockConnector.java    From spark-llap with Apache License 2.0
@Override
public DataReader<Row> createDataReader() {
    return new SimpleMockDataReader();
}
 
Example #12
Source File: MockHiveWarehouseConnector.java    From spark-llap with Apache License 2.0
@Override
protected DataReader<ColumnarBatch> getDataReader(LlapInputSplit split, JobConf jobConf, long arrowAllocatorMax)
    throws Exception {
  return new MockHiveWarehouseDataReader(split, jobConf, arrowAllocatorMax);
}