org.apache.flink.cep.nfa.sharedbuffer.EventId Java Examples

The following examples show how to use org.apache.flink.cep.nfa.sharedbuffer.EventId. Each example lists the source file it comes from, the originating project, and its license.
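
A quick orientation before the examples: EventId identifies an event stored in the CEP shared buffer by a numeric id plus the event's timestamp, it is Comparable, and it ships its own TypeSerializer singleton. Below is a minimal, hedged sketch of those three aspects, using only the constructor, compareTo, and serializer that appear in the examples on this page (the ordering comment reflects an assumption about compareTo):

import org.apache.flink.cep.nfa.sharedbuffer.EventId;

public class EventIdSketch {
	public static void main(String[] args) {
		// Constructor as used in the NFASerializerUpgradeTest example below: (id, timestamp).
		EventId first = new EventId(0, 1_000L);
		EventId second = new EventId(1, 1_000L);
		EventId later = new EventId(0, 2_000L);

		// EventId is Comparable; the skip strategies below rely on compareTo via min/max helpers.
		// Assumption: ordering is by timestamp first, then id.
		System.out.println(first.compareTo(second) < 0);   // expected: true
		System.out.println(second.compareTo(later) < 0);   // expected: true

		// A dedicated serializer is exposed as a singleton (see the readObject example below).
		System.out.println(EventId.EventIdSerializer.INSTANCE.getLength());
	}
}
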
Example #1
Source File: AfterMatchSkipStrategy.java    From flink with Apache License 2.0
/**
 * Prunes matches/partial matches based on the chosen strategy.
 *
 * @param matchesToPrune current partial matches
 * @param matchedResult  already completed matches
 * @param sharedBufferAccessor   accessor to corresponding shared buffer
 * @throws Exception Thrown if the state could not be accessed
 */
public void prune(
		Collection<ComputationState> matchesToPrune,
		Collection<Map<String, List<EventId>>> matchedResult,
		SharedBufferAccessor<?> sharedBufferAccessor) throws Exception {

	EventId pruningId = getPruningId(matchedResult);
	if (pruningId != null) {
		List<ComputationState> discardStates = new ArrayList<>();
		for (ComputationState computationState : matchesToPrune) {
			if (computationState.getStartEventID() != null &&
				shouldPrune(computationState.getStartEventID(), pruningId)) {
				sharedBufferAccessor.releaseNode(computationState.getPreviousBufferEntry());
				discardStates.add(computationState);
			}
		}
		matchesToPrune.removeAll(discardStates);
	}
}
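
The prune hook above is invoked by the NFA after a match completes; which concrete strategy runs (SkipPastLastStrategy, SkipToNextStrategy, or SkipToElementStrategy, all shown further down this page) depends on what the user attaches to the pattern. A hedged sketch of that selection, using Flink CEP's public factory methods (the pattern itself is a trivial placeholder):

import org.apache.flink.cep.nfa.aftermatch.AfterMatchSkipStrategy;
import org.apache.flink.cep.pattern.Pattern;

public class SkipStrategySelectionSketch {
	public static void main(String[] args) {
		// Each factory corresponds to one of the concrete strategies in the examples below:
		// skipPastLastEvent() -> SkipPastLastStrategy, skipToNext() -> SkipToNextStrategy,
		// skipToFirst(name) / skipToLast(name) -> SkipToElementStrategy.
		AfterMatchSkipStrategy skipStrategy = AfterMatchSkipStrategy.skipPastLastEvent();

		// The strategy is attached when the pattern sequence is begun; the NFA later calls
		// its prune(...) hook after every emitted match.
		Pattern<Object, Object> pattern = Pattern.begin("start", skipStrategy);
		System.out.println(pattern.getAfterMatchSkipStrategy());
	}
}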
 
Example #2
Source File: AfterMatchSkipStrategy.java    From Flink-CEPplus with Apache License 2.0
/**
 * Prunes matches/partial matches based on the chosen strategy.
 *
 * @param matchesToPrune current partial matches
 * @param matchedResult  already completed matches
 * @param sharedBufferAccessor   accessor to corresponding shared buffer
 * @throws Exception Thrown if the state could not be accessed
 */
public void prune(
		Collection<ComputationState> matchesToPrune,
		Collection<Map<String, List<EventId>>> matchedResult,
		SharedBufferAccessor<?> sharedBufferAccessor) throws Exception {

	EventId pruningId = getPruningId(matchedResult);
	if (pruningId != null) {
		List<ComputationState> discardStates = new ArrayList<>();
		for (ComputationState computationState : matchesToPrune) {
			if (computationState.getStartEventID() != null &&
				shouldPrune(computationState.getStartEventID(), pruningId)) {
				sharedBufferAccessor.releaseNode(computationState.getPreviousBufferEntry());
				discardStates.add(computationState);
			}
		}
		matchesToPrune.removeAll(discardStates);
	}
}
 
Example #3
Source File: NFA.java    From flink with Apache License 2.0
/**
 * Extracts all the sequences of events from the start to the given computation state. An event
 * sequence is returned as a map which contains the events and the names of the states to which
 * the events were mapped.
 *
 * @param sharedBufferAccessor The accessor to {@link SharedBuffer} from which to extract the matches
 * @param computationState The end computation state of the extracted event sequences
 * @return Collection of event sequences which end in the given computation state
 * @throws Exception Thrown if the system cannot access the state.
 */
private Map<String, List<EventId>> extractCurrentMatches(
		final SharedBufferAccessor<T> sharedBufferAccessor,
		final ComputationState computationState) throws Exception {
	if (computationState.getPreviousBufferEntry() == null) {
		return new HashMap<>();
	}

	List<Map<String, List<EventId>>> paths = sharedBufferAccessor.extractPatterns(
			computationState.getPreviousBufferEntry(),
			computationState.getVersion());

	if (paths.isEmpty()) {
		return new HashMap<>();
	}
	// for a given computation state, we cannot have more than one matching pattern.
	Preconditions.checkState(paths.size() == 1);

	return paths.get(0);
}
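
The map returned above only holds EventId handles keyed by pattern state name; resolving them back into the buffered events is a separate step. A hedged sketch of that step as a hypothetical helper (materializeMatch is the SharedBufferAccessor method Flink's NFA uses for this; treat the exact signature as an assumption):

/**
 * Hypothetical helper: turn the EventId handles from extractCurrentMatches(...) back into events.
 */
private static <T> Map<String, List<T>> resolveMatch(
		SharedBufferAccessor<T> sharedBufferAccessor,
		Map<String, List<EventId>> match) throws Exception {
	// Keys are pattern/state names; values are the events mapped to them, in match order.
	return sharedBufferAccessor.materializeMatch(match);
}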
 
Example #4
Source File: NFA.java    From Flink-CEPplus with Apache License 2.0
/**
 * Extracts all the sequences of events from the start to the given computation state. An event
 * sequence is returned as a map which contains the events and the names of the states to which
 * the events were mapped.
 *
 * @param sharedBufferAccessor The accessor to {@link SharedBuffer} from which to extract the matches
 * @param computationState The end computation state of the extracted event sequences
 * @return Collection of event sequences which end in the given computation state
 * @throws Exception Thrown if the system cannot access the state.
 */
private Map<String, List<EventId>> extractCurrentMatches(
		final SharedBufferAccessor<T> sharedBufferAccessor,
		final ComputationState computationState) throws Exception {
	if (computationState.getPreviousBufferEntry() == null) {
		return new HashMap<>();
	}

	List<Map<String, List<EventId>>> paths = sharedBufferAccessor.extractPatterns(
			computationState.getPreviousBufferEntry(),
			computationState.getVersion());

	if (paths.isEmpty()) {
		return new HashMap<>();
	}
	// for a given computation state, we cannot have more than one matching pattern.
	Preconditions.checkState(paths.size() == 1);

	return paths.get(0);
}
 
Example #5
Source File: MigrationUtils.java    From flink with Apache License 2.0
static <T> Queue<ComputationState> deserializeComputationStates(
		org.apache.flink.cep.nfa.SharedBuffer<T> sharedBuffer,
		TypeSerializer<T> eventSerializer,
		DataInputView source) throws IOException {

	Queue<ComputationState> computationStates = new LinkedList<>();
	StringSerializer stateNameSerializer = StringSerializer.INSTANCE;
	LongSerializer timestampSerializer = LongSerializer.INSTANCE;
	DeweyNumber.DeweyNumberSerializer versionSerializer = DeweyNumber.DeweyNumberSerializer.INSTANCE;

	int computationStateNo = source.readInt();
	for (int i = 0; i < computationStateNo; i++) {
		String state = stateNameSerializer.deserialize(source);
		String prevState = stateNameSerializer.deserialize(source);
		long timestamp = timestampSerializer.deserialize(source);
		DeweyNumber version = versionSerializer.deserialize(source);
		long startTimestamp = timestampSerializer.deserialize(source);
		int counter = source.readInt();

		T event = null;
		if (source.readBoolean()) {
			event = eventSerializer.deserialize(source);
		}

		NodeId nodeId;
		EventId startEventId;
		if (prevState != null) {
			nodeId = sharedBuffer.getNodeId(prevState, timestamp, counter, event);
			startEventId = sharedBuffer.getStartEventId(version.getRun());
		} else {
			nodeId = null;
			startEventId = null;
		}

		computationStates.add(ComputationState.createState(state, nodeId, version, startTimestamp, startEventId));
	}
	return computationStates;
}
 
Example #6
Source File: SkipToElementStrategy.java    From flink with Apache License 2.0
@Override
protected EventId getPruningId(Collection<Map<String, List<EventId>>> match) {
	EventId pruningId = null;
	for (Map<String, List<EventId>> resultMap : match) {
		List<EventId> pruningPattern = resultMap.get(patternName);
		if (pruningPattern == null || pruningPattern.isEmpty()) {
			if (shouldThrowException) {
				throw new FlinkRuntimeException(String.format(
					"Could not skip to %s. No such element in the found match %s",
					patternName,
					resultMap));
			}
		} else {
			pruningId = max(pruningId, pruningPattern.get(getIndex(pruningPattern.size())));
		}

		if (shouldThrowException) {
			EventId startEvent = resultMap.values()
				.stream()
				.flatMap(Collection::stream)
				.min(EventId::compareTo)
				.orElseThrow(() -> new IllegalStateException("Cannot prune based on empty match"));

			if (pruningId != null && pruningId.equals(startEvent)) {
				throw new FlinkRuntimeException("Could not skip to first element of a match.");
			}
		}
	}

	return pruningId;
}
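
The shouldThrowException branch above is the strategy's opt-in strict mode. A hedged sketch of how it is enabled from user code (factory and method names as exposed by AfterMatchSkipStrategy and SkipToElementStrategy):

import org.apache.flink.cep.nfa.aftermatch.AfterMatchSkipStrategy;
import org.apache.flink.cep.nfa.aftermatch.SkipToElementStrategy;

public class StrictSkipSketch {
	public static void main(String[] args) {
		// skipToFirst("middle") prunes up to the first event mapped to "middle";
		// throwExceptionOnMiss() makes a match without that pattern fail fast,
		// which is the FlinkRuntimeException path in the example above.
		SkipToElementStrategy strict =
				AfterMatchSkipStrategy.skipToFirst("middle").throwExceptionOnMiss();
		System.out.println(strict);
	}
}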
 
Example #7
Source File: SkipPastLastStrategy.java    From flink with Apache License 2.0
@Override
protected EventId getPruningId(final Collection<Map<String, List<EventId>>> match) {
	EventId pruningId = null;
	for (Map<String, List<EventId>> resultMap : match) {
		for (List<EventId> eventList : resultMap.values()) {
			pruningId = max(pruningId, eventList.get(eventList.size() - 1));
		}
	}

	return pruningId;
}
 
Example #8
Source File: SharedBuffer.java    From flink with Apache License 2.0
public SharedBuffer(
		Map<EventId, Lockable<V>> eventsBuffer,
		Map<NodeId, Lockable<SharedBufferNode>> pages,
		Map<Tuple2<String, ValueTimeWrapper<V>>, NodeId> mappingContext,
		Map<Integer, EventId> starters) {

	this.eventsBuffer = eventsBuffer;
	this.pages = pages;
	this.mappingContext = mappingContext;
	this.starters = starters;
}
 
Example #9
Source File: NFAStateSerializer.java    From flink with Apache License 2.0
NFAStateSerializer(
		final TypeSerializer<DeweyNumber> versionSerializer,
		final TypeSerializer<NodeId> nodeIdSerializer,
		final TypeSerializer<EventId> eventIdSerializer) {
	this.versionSerializer = checkNotNull(versionSerializer);
	this.nodeIdSerializer = checkNotNull(nodeIdSerializer);
	this.eventIdSerializer = checkNotNull(eventIdSerializer);
}
 
Example #10
Source File: NFAStateSerializer.java    From flink with Apache License 2.0
private ComputationState deserializeSingleComputationState(DataInputView source) throws IOException {
	String stateName = StringValue.readString(source);
	NodeId prevState = nodeIdSerializer.deserialize(source);
	DeweyNumber version = versionSerializer.deserialize(source);
	long startTimestamp = source.readLong();

	EventId startEventId = deserializeStartEvent(source);

	return ComputationState.createState(stateName,
		prevState,
		version,
		startTimestamp,
		startEventId);
}
 
Example #11
Source File: NFASerializerUpgradeTest.java    From flink with Apache License 2.0
@Override
public SharedBufferNode createTestData() {
	SharedBufferNode result = new SharedBufferNode();
	result.addEdge(new SharedBufferEdge(
			new NodeId(new EventId(42, 42L), "page"),
			new DeweyNumber(42)));
	return result;
}
 
Example #12
Source File: NFAStateSerializer.java    From flink with Apache License 2.0
private void serializeStartEvent(EventId startEventID, DataOutputView target) throws IOException {
	if (startEventID != null) {
		target.writeByte(1);
		eventIdSerializer.serialize(startEventID, target);
	} else {
		target.writeByte(0);
	}
}
 
Example #13
Source File: SkipToNextStrategy.java    From flink with Apache License 2.0
@Override
protected EventId getPruningId(final Collection<Map<String, List<EventId>>> match) {
	EventId pruningId = null;
	for (Map<String, List<EventId>> resultMap : match) {
		for (List<EventId> eventList : resultMap.values()) {
			pruningId = min(pruningId, eventList.get(0));
		}
	}

	return pruningId;
}
 
Example #14
Source File: NFAStateSerializer.java    From flink with Apache License 2.0
private EventId deserializeStartEvent(DataInputView source) throws IOException {
	byte isNull = source.readByte();
	EventId startEventId = null;
	if (isNull == 1) {
		startEventId = eventIdSerializer.deserialize(source);
	}
	return startEventId;
}
 
Example #15
Source File: NFAStateSerializer.java    From flink with Apache License 2.0
private void copyStartEvent(DataInputView source, DataOutputView target) throws IOException {
	byte isNull = source.readByte();
	target.writeByte(isNull);

	if (isNull == 1) {
		EventId startEventId = eventIdSerializer.deserialize(source);
		eventIdSerializer.serialize(startEventId, target);
	}
}
 
Example #16
Source File: NFAStateSerializer.java    From flink with Apache License 2.0
private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
	in.defaultReadObject();

	// the nested serializers will be null if this was read from a savepoint taken with versions
	// lower than Flink 1.7; in this case, we explicitly create instances for the nested serializers.
	if (versionSerializer == null || nodeIdSerializer == null || eventIdSerializer == null) {
		this.versionSerializer = DeweyNumber.DeweyNumberSerializer.INSTANCE;
		this.eventIdSerializer = EventId.EventIdSerializer.INSTANCE;
		this.nodeIdSerializer = new NodeId.NodeIdSerializer();
	}
}
 
Example #17
Source File: ComputationState.java    From flink with Apache License 2.0
private ComputationState(
		final String currentState,
		@Nullable final NodeId previousBufferEntry,
		final DeweyNumber version,
		@Nullable final EventId startEventID,
		final long startTimestamp) {
	this.currentStateName = currentState;
	this.version = version;
	this.startTimestamp = startTimestamp;
	this.previousBufferEntry = previousBufferEntry;
	this.startEventID = startEventID;
}
 
Example #18
Source File: AfterMatchSkipStrategy.java    From flink with Apache License 2.0
static EventId max(EventId o1, EventId o2) {
	if (o2 == null) {
		return o1;
	}

	if (o1 == null) {
		return o2;
	}

	if (o1.compareTo(o2) >= 0) {
		return o1;
	} else {
		return o2;
	}
}
 
Example #19
Source File: AfterMatchSkipStrategy.java    From flink with Apache License 2.0
static EventId min(EventId o1, EventId o2) {
	if (o2 == null) {
		return o1;
	}

	if (o1 == null) {
		return o2;
	}

	if (o1.compareTo(o2) <= 0) {
		return o1;
	} else {
		return o2;
	}
}
 
Example #20
Source File: NFA.java    From flink with Apache License 2.0
private void addComputationState(
		SharedBufferAccessor<T> sharedBufferAccessor,
		List<ComputationState> computationStates,
		State<T> currentState,
		NodeId previousEntry,
		DeweyNumber version,
		long startTimestamp,
		EventId startEventId) throws Exception {
	ComputationState computationState = ComputationState.createState(
			currentState.getName(), previousEntry, version, startTimestamp, startEventId);
	computationStates.add(computationState);

	sharedBufferAccessor.lockNode(previousEntry);
}
 
Example #21
Source File: NFA.java    From flink with Apache License 2.0
EventId getEventId() throws Exception {
	if (eventId == null) {
		this.eventId = sharedBufferAccessor.registerEvent(event, timestamp);
	}

	return eventId;
}
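
getEventId above registers the event with the shared buffer lazily: the first call performs registerEvent and later calls reuse the same EventId. A hedged sketch of the same memoization idea as a standalone wrapper (the class name and fields are hypothetical; registerEvent is the accessor call from the example):

import org.apache.flink.cep.nfa.sharedbuffer.EventId;
import org.apache.flink.cep.nfa.sharedbuffer.SharedBufferAccessor;

final class LazyEventId<T> {
	private final SharedBufferAccessor<T> accessor;
	private final T event;
	private final long timestamp;
	private EventId eventId;

	LazyEventId(SharedBufferAccessor<T> accessor, T event, long timestamp) {
		this.accessor = accessor;
		this.event = event;
		this.timestamp = timestamp;
	}

	// Register on first access only; subsequent calls return the cached EventId.
	EventId get() throws Exception {
		if (eventId == null) {
			eventId = accessor.registerEvent(event, timestamp);
		}
		return eventId;
	}
}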
 
Example #22
Source File: NFA.java    From Flink-CEPplus with Apache License 2.0
EventId getEventId() throws Exception {
	if (eventId == null) {
		this.eventId = sharedBufferAccessor.registerEvent(event, timestamp);
	}

	return eventId;
}