org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader Java Examples
The following examples show how to use
org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.
Each example notes the project it comes from, its source file, and its license.
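Before the project examples, here is a minimal, self-contained sketch of driving the reader by hand, outside a running MapReduce job. It assumes a SequenceFile keyed by Text with LongWritable values; the class name SequenceFileDump, the throwaway TaskAttemptID, and the whole-file split are illustrative choices, not part of the SequenceFileRecordReader API.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.TaskAttemptID;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader;
import org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl;

public class SequenceFileDump {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // A dummy task-attempt context is enough to drive the reader outside a real job.
    TaskAttemptContext context = new TaskAttemptContextImpl(conf, new TaskAttemptID());
    // One split spanning the whole file; the reader stops at the real end of file.
    FileSplit split = new FileSplit(new Path(args[0]), 0L, Long.MAX_VALUE, null);
    try (SequenceFileRecordReader<Text, LongWritable> reader = new SequenceFileRecordReader<>()) {
      reader.initialize(split, context);
      while (reader.nextKeyValue()) {
        System.out.println(reader.getCurrentKey() + "\t" + reader.getCurrentValue());
      }
    }
  }
}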
Example #1
Source File: HadoopFormatIOSequenceFileTest.java, from beam (Apache License 2.0)
private Stream<KV<Text, LongWritable>> extractResultsFromFile(String fileName) {
  // The reader is Closeable, so try-with-resources releases the underlying SequenceFile.
  try (SequenceFileRecordReader<Text, LongWritable> reader = new SequenceFileRecordReader<>()) {
    Path path = new Path(fileName);
    TaskAttemptContext taskContext =
        HadoopFormats.createTaskAttemptContext(new Configuration(), new JobID("readJob", 0), 0);
    // A single split covering the whole file; the reader stops at end of file.
    reader.initialize(
        new FileSplit(path, 0L, Long.MAX_VALUE, new String[] {"localhost"}), taskContext);
    List<KV<Text, LongWritable>> result = new ArrayList<>();
    while (reader.nextKeyValue()) {
      // Copy key and value out: the reader reuses its Writable instances between calls.
      result.add(
          KV.of(
              new Text(reader.getCurrentKey().toString()),
              new LongWritable(reader.getCurrentValue().get())));
    }
    return result.stream();
  } catch (Exception e) {
    throw new RuntimeException(e);
  }
}
Example #2
Source File: WritableValueInputFormat.java, from kangaroo (Apache License 2.0)
@Override
public RecordReader<NullWritable, V> createRecordReader(final InputSplit split,
    final TaskAttemptContext context) throws IOException, InterruptedException {
  final SequenceFileRecordReader<NullWritable, V> reader =
      new SequenceFileRecordReader<NullWritable, V>();
  reader.initialize(split, context);
  return reader;
}
Example #3
Source File: CombineShimRecordReader.java, from aliyun-maxcompute-data-collectors (Apache License 2.0)
/**
 * Actually instantiate the user's chosen RecordReader implementation.
 */
@SuppressWarnings("unchecked")
private void createChildReader() throws IOException, InterruptedException {
  LOG.debug("ChildSplit operates on: " + split.getPath(index));
  Configuration conf = context.getConfiguration();

  // Determine the file format we're reading.
  Class rrClass;
  if (ExportJobBase.isSequenceFiles(conf, split.getPath(index))) {
    rrClass = SequenceFileRecordReader.class;
  } else {
    rrClass = LineRecordReader.class;
  }

  // Create the appropriate record reader.
  this.rr = (RecordReader<LongWritable, Object>)
      ReflectionUtils.newInstance(rrClass, conf);
}
Example #4
Source File: DynamicInputChunk.java, from hadoop (Apache License 2.0)
private void openForRead(TaskAttemptContext taskAttemptContext)
    throws IOException, InterruptedException {
  reader = new SequenceFileRecordReader<K, V>();
  reader.initialize(new FileSplit(chunkFilePath, 0,
      DistCpUtils.getFileSize(chunkFilePath, configuration), null),
      taskAttemptContext);
}
Example #5
Source File: GenerateDistCacheData.java, from hadoop (Apache License 2.0)
/**
 * Returns a reader for this split of the distributed cache file list.
 */
@Override
public RecordReader<LongWritable, BytesWritable> createRecordReader(
    InputSplit split, final TaskAttemptContext taskContext)
    throws IOException, InterruptedException {
  return new SequenceFileRecordReader<LongWritable, BytesWritable>();
}
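Note the contrast with Example #2: this reader is returned uninitialized. That is safe because the framework always calls initialize() on the reader before the first nextKeyValue(). The sketch below, with the hypothetical helper readSplit, mirrors that calling sequence; it illustrates the contract and is not actual framework code.

import java.io.IOException;
import org.apache.hadoop.mapreduce.InputFormat;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

final class ReaderLifecycle {
  // Mirrors the framework's calling sequence: create, initialize, iterate, close.
  static <K, V> void readSplit(InputFormat<K, V> format, InputSplit split,
      TaskAttemptContext context) throws IOException, InterruptedException {
    RecordReader<K, V> reader = format.createRecordReader(split, context);
    try {
      // The framework, not createRecordReader(), performs initialization,
      // which is why returning an uninitialized reader is legal.
      reader.initialize(split, context);
      while (reader.nextKeyValue()) {
        System.out.println(reader.getCurrentKey() + "\t" + reader.getCurrentValue());
      }
    } finally {
      reader.close();
    }
  }
}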
Example #6
Source File: DynamicInputChunk.java, from big-c (Apache License 2.0)
private void openForRead(TaskAttemptContext taskAttemptContext)
    throws IOException, InterruptedException {
  reader = new SequenceFileRecordReader<K, V>();
  reader.initialize(new FileSplit(chunkFilePath, 0,
      DistCpUtils.getFileSize(chunkFilePath, configuration), null),
      taskAttemptContext);
}
Example #7
Source File: GenerateDistCacheData.java, from big-c (Apache License 2.0)
/**
 * Returns a reader for this split of the distributed cache file list.
 */
@Override
public RecordReader<LongWritable, BytesWritable> createRecordReader(
    InputSplit split, final TaskAttemptContext taskContext)
    throws IOException, InterruptedException {
  return new SequenceFileRecordReader<LongWritable, BytesWritable>();
}
Example #8
Source File: DynamicInputChunk.java, from circus-train (Apache License 2.0)
private void openForRead(TaskAttemptContext taskAttemptContext)
    throws IOException, InterruptedException {
  reader = new SequenceFileRecordReader<>();
  reader.initialize(
      new FileSplit(chunkFilePath, 0, getFileSize(chunkFilePath, configuration), null),
      taskAttemptContext);
}
Example #9
Source File: S3SequenceFileInputFormat.java, from kangaroo (Apache License 2.0)
@Override
public RecordReader<K, V> createRecordReader(InputSplit split, TaskAttemptContext context)
    throws IOException {
  return new SequenceFileRecordReader<K, V>();
}
Example #10
Source File: SequenceFileLoader.java, from spork (Apache License 2.0)
@SuppressWarnings("unchecked")
@Override
public void prepareToRead(RecordReader reader, PigSplit split) throws IOException {
  this.reader = (SequenceFileRecordReader) reader;
}
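prepareToRead() only caches the reader; records are consumed later in the loader's getNext(). Below is a hedged sketch of the getNext() such a Pig LoadFunc typically pairs with this method, assuming keys and values that render usefully via toString(); the two-field tuple layout is an assumption, not necessarily what spork's SequenceFileLoader emits.

import java.io.IOException;
import org.apache.pig.data.Tuple;
import org.apache.pig.data.TupleFactory;

// A sketch of the companion getNext(); the tuple layout is an assumption.
@Override
public Tuple getNext() throws IOException {
  try {
    if (!reader.nextKeyValue()) {
      return null; // end of this split
    }
    Tuple tuple = TupleFactory.getInstance().newTuple(2);
    tuple.set(0, reader.getCurrentKey().toString());
    tuple.set(1, reader.getCurrentValue().toString());
    return tuple;
  } catch (InterruptedException e) {
    throw new IOException("Interrupted while reading split", e);
  }
}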
Example #11
Source File: SequenceFileStockLoader.java, from hiped2 (Apache License 2.0)
@SuppressWarnings("unchecked")
@Override
public void prepareToRead(RecordReader reader, PigSplit split) throws IOException {
  this.reader = (SequenceFileRecordReader) reader;
}
Example #12
Source File: DynamicInputChunk.java, from big-c (Apache License 2.0)
/**
 * Getter for the record-reader, opened to the chunk-file.
 * @return Opened Sequence-file reader.
 */
public SequenceFileRecordReader<K, V> getReader() {
  assert reader != null : "Reader un-initialized!";
  return reader;
}
Example #13
Source File: DynamicInputChunk.java, from hadoop (Apache License 2.0)
/**
 * Getter for the record-reader, opened to the chunk-file.
 * @return Opened Sequence-file reader.
 */
public SequenceFileRecordReader<K, V> getReader() {
  assert reader != null : "Reader un-initialized!";
  return reader;
}
Example #14
Source File: UniformSizeInputFormat.java, from big-c (Apache License 2.0)
/**
 * Implementation of InputFormat::createRecordReader().
 * @param split The split for which the RecordReader is sought.
 * @param context The context of the current task-attempt.
 * @return A SequenceFileRecordReader instance (since the copy-listing is a
 *         simple sequence-file).
 * @throws IOException
 * @throws InterruptedException
 */
@Override
public RecordReader<Text, CopyListingFileStatus> createRecordReader(
    InputSplit split, TaskAttemptContext context)
    throws IOException, InterruptedException {
  return new SequenceFileRecordReader<Text, CopyListingFileStatus>();
}
Example #15
Source File: UniformSizeInputFormat.java, from hadoop (Apache License 2.0)
/**
 * Implementation of InputFormat::createRecordReader().
 * @param split The split for which the RecordReader is sought.
 * @param context The context of the current task-attempt.
 * @return A SequenceFileRecordReader instance (since the copy-listing is a
 *         simple sequence-file).
 * @throws IOException
 * @throws InterruptedException
 */
@Override
public RecordReader<Text, CopyListingFileStatus> createRecordReader(
    InputSplit split, TaskAttemptContext context)
    throws IOException, InterruptedException {
  return new SequenceFileRecordReader<Text, CopyListingFileStatus>();
}
Example #16
Source File: DynamicInputChunk.java, from circus-train (Apache License 2.0)
/**
 * Getter for the record-reader, opened to the chunk-file.
 *
 * @return Opened Sequence-file reader.
 */
public SequenceFileRecordReader<K, V> getReader() {
  assert reader != null : "Reader un-initialized!";
  return reader;
}
Example #17
Source File: UniformSizeInputFormat.java, from circus-train (Apache License 2.0)
/**
 * Implementation of InputFormat::createRecordReader().
 *
 * @param split The split for which the RecordReader is sought.
 * @param context The context of the current task-attempt.
 * @return A SequenceFileRecordReader instance (since the copy-listing is a simple sequence-file).
 * @throws IOException
 * @throws InterruptedException
 */
@Override
public RecordReader<Text, CopyListingFileStatus> createRecordReader(InputSplit split,
    TaskAttemptContext context) throws IOException, InterruptedException {
  return new SequenceFileRecordReader<>();
}