Java Code Examples for org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator#setParallelism()

The following examples show how to use org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator#setParallelism(). The snippets are taken from open source projects; the original project and source file are noted above each example.
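As background for the examples below: calling setParallelism() on a SingleOutputStreamOperator overrides the environment's default parallelism for that one operator only. Here is a minimal, self-contained sketch (not taken from any of the projects below; the class name and values are illustrative):

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SetParallelismSketch {

	public static void main(String[] args) throws Exception {
		StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
		// Default parallelism for all operators that do not override it.
		env.setParallelism(4);

		SingleOutputStreamOperator<Integer> doubled = env
			.fromElements(1, 2, 3)
			.map(new MapFunction<Integer, Integer>() {
				@Override
				public Integer map(Integer value) {
					return value * 2;
				}
			});

		// Override the parallelism of this single map operator only.
		doubled.setParallelism(2);

		doubled.print();
		env.execute("setParallelism sketch");
	}
}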
Example 1
Source File: CsvTableSink.java    From flink with Apache License 2.0
@Override
public DataStreamSink<?> consumeDataStream(DataStream<Row> dataStream) {
	SingleOutputStreamOperator<String> csvRows =
		dataStream.map(new CsvFormatter(fieldDelim == null ? "," : fieldDelim));

	DataStreamSink<String> sink;
	if (writeMode != null) {
		sink = csvRows.writeAsText(path, writeMode);
	} else {
		sink = csvRows.writeAsText(path);
	}

	if (numFiles > 0) {
		csvRows.setParallelism(numFiles);
		sink.setParallelism(numFiles);
	} else {
		// if file number is not set, use input parallelism to make it chained.
		csvRows.setParallelism(dataStream.getParallelism());
		sink.setParallelism(dataStream.getParallelism());
	}

	sink.name(TableConnectorUtils.generateRuntimeName(CsvTableSink.class, fieldNames));

	return sink;
}
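Note the design choice here: when numFiles is not set, both the CSV formatter and the sink reuse the input stream's parallelism, which lets Flink chain them to the upstream operator and avoid a network shuffle. When numFiles is set, the parallelism of both operators is pinned to that value so that writeAsText() produces exactly that many output files, one per parallel subtask.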
 
Example 2
Source File: FlinkTopology.java    From incubator-samoa with Apache License 2.0
private void initializeCycle(int cycleID) {
    //get the head and tail of cycle
    FlinkProcessingItem tail = cycles.get(cycleID).get(0);
    FlinkProcessingItem head = cycles.get(cycleID).get(cycles.get(cycleID).size() - 1);

    //initialise source stream of the iteration, so as to use it for the iteration starting point
    if (!head.isInitialised()) {
        head.setOnIteration(true);
        head.initialise();
        head.initialiseStreams();
    }

    //initialise all nodes after head
    for (int node = cycles.get(cycleID).size() - 2; node >= 0; node--) {
        FlinkProcessingItem processingItem = cycles.get(cycleID).get(node);
        processingItem.initialise();
        processingItem.initialiseStreams();
    }

    SingleOutputStreamOperator backedge = (SingleOutputStreamOperator) head.getInputStreamBySourceID(tail.getComponentId()).getOutStream();
    backedge.setParallelism(head.getParallelism());
    ((IterativeStream) head.getDataStream()).closeWith(backedge);
}
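The setParallelism() call on the backedge matters because IterativeStream#closeWith() rejects a feedback stream whose parallelism differs from that of the stream the iteration was started on, so the backedge is explicitly aligned with the iteration head. Here is a minimal, self-contained sketch of that constraint (not from SAMOA; the class and operator names are illustrative):

import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.IterativeStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class IterationFeedbackParallelismSketch {

	public static void main(String[] args) throws Exception {
		StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

		DataStream<Long> input = env.fromElements(5L, 8L, 13L);

		// The 5s timeout lets the iteration terminate once no records circulate anymore.
		IterativeStream<Long> iteration = input.iterate(5000L);

		SingleOutputStreamOperator<Long> decremented = iteration.map(new MapFunction<Long, Long>() {
			@Override
			public Long map(Long value) {
				return value - 1;
			}
		});

		SingleOutputStreamOperator<Long> feedback = decremented.filter(new FilterFunction<Long>() {
			@Override
			public boolean filter(Long value) {
				return value > 0;
			}
		});

		// closeWith() requires the feedback stream's parallelism to match the
		// iteration head's, hence the explicit setParallelism() on the backedge.
		feedback.setParallelism(iteration.getParallelism());
		iteration.closeWith(feedback);

		// Records that have reached zero leave the loop.
		decremented.filter(new FilterFunction<Long>() {
			@Override
			public boolean filter(Long value) {
				return value <= 0;
			}
		}).print();

		env.execute("iteration feedback parallelism sketch");
	}
}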
 