Java Code Examples for org.apache.beam.sdk.options.Default#String

The following examples show how to use org.apache.beam.sdk.options.Default#String. Each example notes the source file and the project it was taken from.
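Before the individual examples, here is a minimal, self-contained sketch of the annotation in context. The GreetingOptions interface, option name, and default text are illustrative and not taken from any of the projects below; when --greeting is absent from the arguments, getGreeting() returns the @Default.String value.

import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class DefaultStringExample {

  public interface GreetingOptions extends PipelineOptions {
    @Description("Greeting printed by the pipeline.")
    @Default.String("Hello from the default value")
    String getGreeting();

    void setGreeting(String value);
  }

  public static void main(String[] args) {
    // If --greeting=... is not passed, getGreeting() falls back to the @Default.String value.
    GreetingOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(GreetingOptions.class);
    System.out.println(options.getGreeting());
  }
}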
Example 1
Source File: CsvConverters.java    From DataflowTemplates with Apache License 2.0
@Description(
    "Csv format according to Apache Commons CSV format. Default is: Apache Commons CSV"
        + " default\n"
        + "https://static.javadoc.io/org.apache.commons/commons-csv/1.7/org/apache/commons/csv/CSVFormat.html#DEFAULT\n"
        + "Must match format names exactly found at: "
        + "https://static.javadoc.io/org.apache.commons/commons-csv/1.7/org/apache/commons/csv/CSVFormat.Predefined.html")
@Default.String("Default")
String getCsvFormat();
 
Example 2
Source File: HourlyTeamScore.java    From beam with Apache License 2.0
@Description(
    "String representation of the first minute after which to generate results,"
        + "in the format: yyyy-MM-dd-HH-mm . This time should be in PST."
        + "Any input data timestamped prior to that minute won't be included in the sums.")
@Default.String("1970-01-01-00-00")
String getStartMin();
 
Example 3
Source File: DebuggingWordCount.java    From beam with Apache License 2.0
@Description(
    "Regex filter pattern to use in DebuggingWordCount. "
        + "Only words matching this pattern will be counted.")
@Default.String("Flourish|stomach")
String getFilterPattern();
 
Example 4
Source File: KuduIOIT.java    From beam with Apache License 2.0
@Description("Kudu table")
@Default.String("beam-integration-test")
String getKuduTable();
 
Example 5
Source File: HCatalogIOIT.java    From beam with Apache License 2.0
@Description("HCatalog hive database")
@Default.String("default")
String getHCatalogHiveDatabaseName();
 
Example 6
Source File: HelloWorldRead.java    From java-docs-samples with Apache License 2.0
@Description("The Bigtable table ID in the instance.")
@Default.String("bigtable-table")
String getBigtableTableId();
 
Example 7
Source File: Snippets.java    From beam with Apache License 2.0
@Description("My option")
@Default.String("Hello world!")
ValueProvider<String> getStringValue();
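
This example (like Examples 12 and 17) declares the option as ValueProvider<String> rather than String, so the value is resolved at pipeline execution time rather than when the graph is constructed, which is what lets templated runs substitute a runtime value for the default. A hedged sketch of consuming such an option; the class name here is made up:

import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.transforms.DoFn;

// Illustrative sketch: a ValueProvider-backed option passed into a DoFn.
class AppendOptionValueFn extends DoFn<String, String> {
  private final ValueProvider<String> stringValue;

  AppendOptionValueFn(ValueProvider<String> stringValue) {
    this.stringValue = stringValue;
  }

  @ProcessElement
  public void processElement(ProcessContext c) {
    // get() resolves the @Default.String value unless a runtime value overrides it.
    c.output(c.element() + " " + stringValue.get());
  }
}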
 
Example 8
Source File: BigQueryTimePartitioningClusteringIT.java    From beam with Apache License 2.0
@Description("Table to read from, specified as " + "<project_id>:<dataset_id>.<table_id>")
@Default.String(WEATHER_SAMPLES_TABLE)
String getBqcInput();
 
Example 9
Source File: JdbcExportPipelineOptions.java    From dbeam with Apache License 2.0
@Default.String("P7D")
@Description(
    "Export timeout, after this duration the job is cancelled and the export terminated.")
String getExportTimeout();
 
Example 10
Source File: TrafficRoutes.java    From beam with Apache License 2.0
@Description("Path of the file to read from")
@Default.String(
    "gs://apache-beam-samples/traffic_sensor/"
        + "Freeways-5Minaa2010-01-01_to_2010-02-15_test2.csv")
String getInputFile();
 
Example 11
Source File: NexmarkOptions.java    From beam with Apache License 2.0
@Description("Base name of pubsub topic to publish to in streaming mode.")
@Nullable
@Default.String("nexmark")
String getPubsubTopic();
 
Example 12
Source File: CassandraToBigtable.java    From DataflowTemplates with Apache License 2.0
@Description("Default Column Family")
@Default.String("default")
ValueProvider<String> getDefaultColumnFamily();
 
Example 13
Source File: AnnotateVariants.java    From dataflow-java with Apache License 2.0
@Description("The IDs of the Google Genomics variant annotation sets this pipeline is working "
    + "with, comma delimited. Defaults to ClinVar (GRCh37).")
@Default.String("CILSqfjtlY6tHxC0nNH-4cu-xlQ")
String getVariantAnnotationSetIds();
 
Example 14
Source File: BigQueryCommonOptions.java    From DataflowTemplates with Apache License 2.0
@Description(
    "Write disposition to use for BigQuery. "
        + "Default: WRITE_APPEND")
@Default.String("WRITE_APPEND")
String getWriteDisposition();
 
Example 15
Source File: SpannerReadIT.java    From beam with Apache License 2.0
@Description("Instance ID to write to in Spanner")
@Default.String("beam-test")
String getInstanceId();
 
Example 16
Source File: ReadData.java    From java-docs-samples with Apache License 2.0
@Description("The Bigtable instance ID")
@Default.String("bigtable-instance")
String getBigtableInstanceId();
 
Example 17
Source File: PubsubToAvro.java    From DataflowTemplates with Apache License 2.0
@Description(
    "The shard template of the output file. Specified as repeating sequences "
        + "of the letters 'S' or 'N' (example: SSS-NNN). These are replaced with the "
        + "shard number, or number of shards respectively")
@Default.String("W-P-SS-of-NN")
ValueProvider<String> getOutputShardTemplate();
 
Example 18
Source File: BQETLOptions.java    From bigquery-etl-dataflow-sample with Apache License 2.0
@Description("Location of aritst credit name json.")
@Default.String("gs://mb-data")
String getLoadingBucketURL();
 
Example 19
Source File: BigQueryMergeValidatorTemplate.java    From DataflowTemplates with Apache License 2.0
@Description("The table to merge data into.")
@Default.String("")
String getReplicaTable();
 
Example 20
Source File: FlinkPipelineOptions.java    From beam with Apache License 3.0
/**
 * The URL of the Flink JobManager on which to execute pipelines. This can either be the
 * address of a cluster JobManager, in the form "host:port", or one of the special strings
 * "[local]", "[collection]" or "[auto]". "[local]" will start a local Flink cluster in the JVM,
 * "[collection]" will execute the pipeline on Java Collections, while "[auto]" will let the system
 * decide where to execute the pipeline based on the environment.
 */
@Description(
    "Address of the Flink Master where the Pipeline should be executed. Can"
        + " either be of the form \"host:port\" or one of the special values [local], "
        + "[collection] or [auto].")
@Default.String(AUTO)
String getFlinkMaster();
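
As a usage sketch (the class name and flag values are illustrative, and it assumes the Flink runner artifact is on the classpath): the option is normally populated from command-line arguments, and the @Default.String(AUTO) value only takes effect when --flinkMaster is not passed.

import org.apache.beam.runners.flink.FlinkPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class FlinkMasterExample {
  public static void main(String[] args) {
    // Illustrative values: omit --flinkMaster and the "[auto]" default applies,
    // letting the runner decide where to execute based on the environment.
    String[] flags = {"--runner=FlinkRunner", "--flinkMaster=[local]"};
    FlinkPipelineOptions options =
        PipelineOptionsFactory.fromArgs(flags).withValidation().as(FlinkPipelineOptions.class);
    System.out.println(options.getFlinkMaster());
  }
}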