org.apache.beam.sdk.options.Validation.Required Java Examples

The following examples show how to use org.apache.beam.sdk.options.Validation.Required, drawn from a range of open source projects. The source file, project, and license for each example are noted in its header.
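
For orientation, here is a minimal, self-contained sketch of how @Validation.Required is typically declared on a PipelineOptions interface. The interface and option names (MyOptions, inputFile) are hypothetical, not taken from the examples below.

import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.Validation;

public interface MyOptions extends PipelineOptions {

  // Marked required: constructing the options with validation enabled
  // fails fast if --inputFile is not supplied on the command line.
  @Description("Path of the file to read from")
  @Validation.Required
  String getInputFile();

  void setInputFile(String value);
}

Validation is triggered at construction time, typically via PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class).
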
Example #1
Source File: PipelineOptionsFactory.java    From beam with Apache License 2.0
/**
 * Output the requirement groups that the property is a member of, including all properties that
 * satisfy the group requirement, breaking up long lines on white space characters and attempting
 * to honor a line limit of {@code TERMINAL_WIDTH}.
 */
private static void prettyPrintRequiredGroups(
    PrintStream out,
    Required annotation,
    SortedSetMultimap<String, String> requiredGroupNameToProperties) {
  if (annotation == null || annotation.groups() == null) {
    return;
  }
  for (String group : annotation.groups()) {
    SortedSet<String> groupMembers = requiredGroupNameToProperties.get(group);
    String requirement;
    if (groupMembers.size() == 1) {
      requirement = Iterables.getOnlyElement(groupMembers) + " is required.";
    } else {
      requirement = "At least one of " + groupMembers + " is required";
    }
    terminalPrettyPrint(out, requirement.split("\\s+"));
  }
}
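
The groups() attribute printed by the method above lets several properties jointly satisfy a single requirement. As a hedged illustration (the interface and property names are hypothetical), declaring two getters in the same group means at least one of them must be set:

import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.Validation;

public interface SourceOptions extends PipelineOptions {

  // Either property satisfies the "source" requirement group.
  @Validation.Required(groups = {"source"})
  String getInputFile();

  void setInputFile(String value);

  @Validation.Required(groups = {"source"})
  String getInputTable();

  void setInputTable(String value);
}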
 
Example #2
Source File: BulkDecompressor.java    From DataflowTemplates with Apache License 2.0
@Description(
    "The output file to write failures during the decompression process "
        + "(e.g. gs://bucket-name/decompressed/failed.txt). The contents will be one line for "
        + "each file which failed decompression. Note that this parameter will "
        + "allow the pipeline to continue processing in the event of a failure.")
@Required
ValueProvider<String> getOutputFailureFile();
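
Options typed as ValueProvider, like the one above, are resolved at pipeline execution time rather than at graph construction. A minimal sketch of consuming such an option inside a DoFn (the class and its names are hypothetical):

import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.transforms.DoFn;

class ReportFailureFn extends DoFn<String, String> {

  private final ValueProvider<String> failureFile;

  ReportFailureFn(ValueProvider<String> failureFile) {
    this.failureFile = failureFile;
  }

  @ProcessElement
  public void processElement(ProcessContext c) {
    // ValueProvider.get() is only safe to call at execution time,
    // which is why the provider itself, not its value, is stored here.
    c.output(failureFile.get() + ": " + c.element());
  }
}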
 
Example #3
Source File: DLPTextToBigQueryStreaming.java    From DataflowTemplates with Apache License 2.0
@Description(
    "The DLP API limits request payloads to 524KB per API call, so the "
        + "Dataflow pipeline must chunk its requests. Choose a batch size "
        + "based on the number of rows and the size of each row.")
@Required
ValueProvider<Integer> getBatchSize();
 
Example #4
Source File: BulkCompressor.java    From DataflowTemplates with Apache License 2.0
@Description(
    "The output file to write failures during the compression process "
        + "(e.g. gs://bucket-name/compressed/failed.txt). The contents will be one line for "
        + "each file which failed compression. Note that this parameter will "
        + "allow the pipeline to continue processing in the event of a failure.")
@Required
ValueProvider<String> getOutputFailureFile();
 
Example #5
Source File: PipelineOptionsFactory.java    From beam with Apache License 2.0
/**
 * Returns a map of required groups of arguments to the properties that satisfy the requirement.
 */
private static SortedSetMultimap<String, String> getRequiredGroupNamesToProperties(
    Map<String, Method> propertyNamesToGetters) {
  SortedSetMultimap<String, String> result = TreeMultimap.create();
  for (Map.Entry<String, Method> propertyEntry : propertyNamesToGetters.entrySet()) {
    Required requiredAnnotation =
        propertyEntry.getValue().getAnnotation(Validation.Required.class);
    if (requiredAnnotation != null) {
      for (String groupName : requiredAnnotation.groups()) {
        result.put(groupName, propertyEntry.getKey());
      }
    }
  }
  return result;
}
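
Given the hypothetical SourceOptions interface sketched after Example #1, this method would return a multimap containing "source" -> {"inputFile", "inputTable"}, which prettyPrintRequiredGroups in Example #1 then renders as "At least one of [inputFile, inputTable] is required."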
 
Example #6
Source File: ImportOptions.java    From feast with Apache License 2.0
@Required
@Description(
    "JSON string representation of the Store that the import job will write FeatureRows to. "
        + "A Store follows the format in the feast.core.Store proto. "
        + "Multiple Stores can be passed by specifying '--store={...}' multiple times. "
        + "The conversion of the proto message to JSON should follow this mapping: "
        + "https://developers.google.com/protocol-buffers/docs/proto3#json. "
        + "Minify the JSON string, removing insignificant whitespace such as newlines, "
        + "to prevent errors when parsing the options.")
List<String> getStoresJson();
 
Example #7
Source File: DLPTextToBigQueryStreaming.java    From dlp-dataflow-deidentification with Apache License 2.0
@Description(
    "The DLP API limits request payloads to 524KB per API call, so the "
        + "Dataflow pipeline must chunk its requests. Choose a batch size "
        + "based on the number of rows and the size of each row.")
@Required
ValueProvider<Integer> getBatchSize();
 
Example #8
Source File: ImportOptions.java    From feast with Apache License 2.0
@Required
@Description(
    "JSON string representation of the SpecsPipe configuration. "
        + "The job uses this to know where to read new FeatureSetSpecs from (Kafka broker & topic) "
        + "and where to send acknowledgments of successful updates to the job's state. "
        + "SpecsStreamingUpdateConfig follows the format in the feast.core.IngestionJob.SpecsStreamingUpdateConfig proto. "
        + "The conversion of the proto message to JSON should follow this mapping: "
        + "https://developers.google.com/protocol-buffers/docs/proto3#json. "
        + "Minify the JSON string, removing insignificant whitespace such as newlines, "
        + "to prevent errors when parsing the options.")
String getSpecsStreamingUpdateConfigJson();
 
Example #9
Source File: ImportOptions.java    From feast with Apache License 2.0
@Required
@Description(
    "JSON string representation of the Source that will be used to read FeatureRows from. "
        + "Source follows the format in the feast.core.Source proto. Currently only a Kafka source is supported. "
        + "The conversion of the proto message to JSON should follow this mapping: "
        + "https://developers.google.com/protocol-buffers/docs/proto3#json. "
        + "Minify the JSON string, removing insignificant whitespace such as newlines, "
        + "to prevent errors when parsing the options.")
String getSourceJson();
 
Example #10
Source File: BigQueryLoader.java    From quetzal with Eclipse Public License 2.0
/**
 * Set this option to choose a different input file or glob.
 */
@Description("Path of the file to read from")
@Required
String getInputFile();
 
Example #11
Source File: IdentifyPrivateVariants.java    From dataflow-java with Apache License 2.0
@Override
@Description("The ID of the Google Genomics variant set from which this pipeline "
    + "will identify private variants.")
@Required
String getVariantSetId();
 
Example #12
Source File: BulkCompressor.java    From DataflowTemplates with Apache License 2.0
@Description("The compression algorithm to use on the matched files.")
@Required
ValueProvider<Compression> getCompression();
 
Example #13
Source File: WordCount.java    From beam with Apache License 2.0
/** Set this required option to specify where to write the output. */
@Description("Path of the file to write to")
@Required
String getOutput();
 
Example #14
Source File: CallSetNamesOptions.java    From dataflow-java with Apache License 2.0
@Required
@Description("The ID of the Google Genomics variant set this pipeline is accessing.")
String getVariantSetId();
 
Example #15
Source File: PubSubToGCS.java    From java-docs-samples with Apache License 2.0
@Description("Path of the output file including its filename prefix.")
@Required
String getOutput();
 
Example #16
Source File: JobOptions.java    From cloud-bigtable-examples with Apache License 2.0
@Required
@Description("The Google Cloud Bigtable instance ID .")
String getBigtableInstanceId();
 
Example #17
Source File: JobOptions.java    From cloud-bigtable-examples with Apache License 2.0
@Required
@Description("The Cloud Bigtable table ID in the instance." )
String getBigtableTableId();
 
Example #18
Source File: PubSubToGCS.java    From java-docs-samples with Apache License 2.0
@Description("The Cloud Pub/Sub topic to read from.")
@Required
String getInputTopic();
 
Example #19
Source File: DeleteVariants.java    From dataflow-java with Apache License 2.0
@Description("The Cloud Storage filepath to a comma-separated or tab-separated file of variant ids. "
    + "The variant id will be retrieved from the first column.  Any other columns will be ignored. "
    + "The file should not include any header lines.")
@Required
String getInput();
 
Example #20
Source File: BulkCompressor.java    From DataflowTemplates with Apache License 2.0
@Description("The output location to write to (e.g. gs://bucket-name/compressed)")
@Required
ValueProvider<String> getOutputDirectory();
 
Example #21
Source File: BulkCompressor.java    From DataflowTemplates with Apache License 2.0
@Description("The input file pattern to read from (e.g. gs://bucket-name/uncompressed/*.gz)")
@Required
ValueProvider<String> getInputFilePattern();
 
Example #22
Source File: DLPTextToBigQueryStreaming.java    From dlp-dataflow-deidentification with Apache License 2.0
@Description(
    "DLP Deidentify Template to be used for the API request "
        + "(e.g. projects/{project_id}/deidentifyTemplates/{deIdTemplateId}).")
@Required
ValueProvider<String> getDeidentifyTemplateName();
 
Example #23
Source File: DLPTextToBigQueryStreaming.java    From DataflowTemplates with Apache License 2.0
@Description(
    "DLP Deidentify Template to be used for the API request "
        + "(e.g. projects/{project_id}/deidentifyTemplates/{deIdTemplateId}).")
@Required
ValueProvider<String> getDeidentifyTemplateName();
 
Example #24
Source File: BigQueryLoader.java    From quetzal with Eclipse Public License 2.0
/**
 * Set this required option to specify where to write the output.
 */
@Description("Path of the file to write to")
@Required
String getOutput();
 
Example #25
Source File: BulkDecompressor.java    From DataflowTemplates with Apache License 2.0
@Description("The output location to write to (e.g. gs://bucket-name/decompressed)")
@Required
ValueProvider<String> getOutputDirectory();
 
Example #26
Source File: BulkDecompressor.java    From DataflowTemplates with Apache License 2.0
@Description("The input file pattern to read from (e.g. gs://bucket-name/compressed/*.gz)")
@Required
ValueProvider<String> getInputFilePattern();
 
Example #27
Source File: PubsubToAvro.java    From DataflowTemplates with Apache License 2.0
@Description("The Avro Write Temporary Directory. Must end with /")
@Required
ValueProvider<String> getAvroTempDirectory();
 
Example #28
Source File: PubsubToAvro.java    From DataflowTemplates with Apache License 2.0
@Description("The directory to output files to. Must end with a slash.")
@Required
ValueProvider<String> getOutputDirectory();
 
Example #29
Source File: PubsubToAvro.java    From DataflowTemplates with Apache License 2.0
@Description("The Cloud Pub/Sub topic to read from.")
@Required
ValueProvider<String> getInputTopic();
 
Example #30
Source File: SubProcessPipelineOptions.java    From beam with Apache License 2.0
@Description("As sub-processes can be heavy weight define the level of concurrency level")
@Required
Integer getConcurrency();
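
In all of the examples above, a missing required option surfaces when the options are constructed with validation enabled. A hedged sketch of the failure mode, assuming the SubProcessPipelineOptions interface from Example #30 is on the classpath and using an illustrative argument value:

import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class ValidationDemo {
  public static void main(String[] args) {
    // Succeeds because --concurrency is supplied. Omitting it would make
    // withValidation() throw an IllegalArgumentException that names the
    // missing required property.
    SubProcessPipelineOptions options =
        PipelineOptionsFactory.fromArgs("--concurrency=4")
            .withValidation()
            .as(SubProcessPipelineOptions.class);
    System.out.println(options.getConcurrency());
  }
}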