Java Code Examples for org.apache.spark.api.java.JavaSparkContext#setLocalProperty()

The following examples show how to use org.apache.spark.api.java.JavaSparkContext#setLocalProperty().
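
Before the project examples, here is a minimal, self-contained sketch of the call itself. The application name, master URL, pool name, and description below are illustrative placeholders; the property keys are standard Spark keys.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SetLocalPropertyDemo {
    public static void main(String[] args) {
        // Illustrative configuration; the app name and master are placeholders.
        SparkConf conf = new SparkConf().setAppName("setLocalProperty-demo").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Local properties only affect jobs submitted from the calling thread.
        sc.setLocalProperty("spark.scheduler.pool", "lowPriorityPool");
        sc.setLocalProperty("spark.job.description", "demo job");

        System.out.println(sc.getLocalProperty("spark.scheduler.pool")); // prints "lowPriorityPool"

        // Setting a property to null removes it for this thread.
        sc.setLocalProperty("spark.scheduler.pool", null);

        sc.stop();
    }
}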
Example 1
Source File: GraknSparkComputer.java    From grakn with GNU Affero General Public License v3.0
/**
 * When using a persistent context, the running context's configuration overrides a
 * passed-in configuration. Spark allows us to override these inherited properties via
 * SparkContext.setLocalProperty.
 */
private static void updateLocalConfiguration(JavaSparkContext sparkContext, Configuration configuration) {
    /*
     * While we could enumerate over the entire Spark configuration and copy it into the
     * thread-local properties of the SparkContext, this could cause adverse effects with
     * future versions of Spark. Since the API for setting multiple local properties at once
     * is private, we only set those properties we know can affect SparkGraphComputer
     * execution rather than applying the entire configuration.
     */

    String[] validPropertyNames = {
            "spark.job.description",
            "spark.jobGroup.id",
            "spark.job.interruptOnCancel",
            "spark.scheduler.pool"
    };

    for (String propertyName : validPropertyNames) {
        String propertyValue = configuration.get(propertyName);
        if (propertyValue != null) {
            LOGGER.info("Setting Thread Local SparkContext Property - "
                    + propertyName + " : " + propertyValue);
            sparkContext.setLocalProperty(propertyName, propertyValue);
        }
    }
}
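
For context, a caller might populate a configuration with the whitelisted keys before invoking the method above. The fragment below is a hedged usage sketch: it assumes the Configuration parameter is Hadoop's org.apache.hadoop.conf.Configuration (whose get(String) returns null for unset keys, as the example relies on), and the concrete key values are made up for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.api.java.JavaSparkContext;

// Hedged usage sketch building on updateLocalConfiguration() above; the
// Configuration type and values are assumptions made for illustration.
static void submitWithLocalOverrides(JavaSparkContext sparkContext) {
    Configuration configuration = new Configuration(false);
    configuration.set("spark.job.description", "graph computation for a single query");
    configuration.set("spark.scheduler.pool", "analytics");
    configuration.set("some.unrelated.key", "ignored");   // not whitelisted, so never copied

    updateLocalConfiguration(sparkContext, configuration);
    // Jobs submitted from this thread now carry the description and pool above;
    // other threads sharing the persistent context are unaffected.
}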
 
Example 2
Source File: SparkGraphComputer.java    From tinkerpop with Apache License 2.0
/**
 * When using a persistent context, the running context's configuration overrides a
 * passed-in configuration. Spark allows us to override these inherited properties via
 * SparkContext.setLocalProperty.
 */
private void updateLocalConfiguration(final JavaSparkContext sparkContext, final Configuration configuration) {
    /*
     * While we could enumerate over the entire Spark configuration and copy it into the
     * thread-local properties of the SparkContext, this could cause adverse effects with
     * future versions of Spark. Since the API for setting multiple local properties at once
     * is private, we only set those properties we know can affect SparkGraphComputer
     * execution rather than applying the entire configuration.
     */
    final String[] validPropertyNames = {
            "spark.job.description",
            "spark.jobGroup.id",
            "spark.job.interruptOnCancel",
            "spark.scheduler.pool"
    };

    for (String propertyName : validPropertyNames) {
        String propertyValue = configuration.get(propertyName);
        if (propertyValue != null) {
            this.logger.info("Setting Thread Local SparkContext Property - "
                    + propertyName + " : " + propertyValue);
            sparkContext.setLocalProperty(propertyName, propertyValue);
        }
    }
}
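
The whitelisted keys above are the same ones Spark's higher-level job-group API manages: setJobGroup() populates spark.jobGroup.id, spark.job.description and spark.job.interruptOnCancel as thread-local properties, and cancelJobGroup() uses the group id to cancel matching jobs. The sketch below shows that more common route; the app name, master, group id and timings are illustrative placeholders.

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class JobGroupDemo {
    public static void main(String[] args) throws InterruptedException {
        // App name and master are illustrative placeholders.
        SparkConf conf = new SparkConf().setAppName("job-group-demo").setMaster("local[2]");
        final JavaSparkContext sc = new JavaSparkContext(conf);

        Thread worker = new Thread(() -> {
            // setJobGroup() fills in the same thread-local properties the examples
            // above copy by hand.
            sc.setJobGroup("slow-group", "deliberately slow job", true);
            try {
                sc.parallelize(Arrays.asList(1, 2, 3, 4), 4)
                  .foreach(i -> Thread.sleep(60_000));   // simulate long-running work
            } catch (Exception cancelled) {
                System.out.println("Job was cancelled: " + cancelled.getMessage());
            }
        });
        worker.start();

        Thread.sleep(5_000);                 // give the job time to start
        sc.cancelJobGroup("slow-group");     // cancels only jobs tagged with this group id
        worker.join();
        sc.stop();
    }
}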
 
Example 3
Source File: SpliceSpark.java    From spliceengine with GNU Affero General Public License v3.0
public static void pushScope(String displayString) {
    JavaSparkContext jspc = SpliceSpark.getContext();
    jspc.setCallSite(displayString);
    // Save the current scope properties so they can be restored later.
    jspc.setLocalProperty(OLD_SCOPE_KEY, jspc.getLocalProperty(SCOPE_KEY));
    jspc.setLocalProperty(OLD_SCOPE_OVERRIDE, jspc.getLocalProperty(SCOPE_OVERRIDE));
    // Install a new RDD operation scope named after the display string so the
    // Spark UI groups the operations created while it is active.
    jspc.setLocalProperty(SCOPE_KEY, new RDDOperationScope(displayString, null, RDDOperationScope.nextScopeId() + "").toJson());
    jspc.setLocalProperty(SCOPE_OVERRIDE, "true");
}
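
The OLD_* keys saved above imply a matching restore step. The sketch below is one hypothetical way a popScope() counterpart could put things back; the constant names and SpliceSpark.getContext() are reused from the example, but the implementation is an assumption for illustration, not necessarily spliceengine's actual code.

// Hypothetical counterpart to pushScope() above; an illustrative assumption,
// not necessarily the project's actual popScope() implementation.
public static void popScope() {
    JavaSparkContext jspc = SpliceSpark.getContext();
    // Restore whatever scope was active before pushScope() ran.
    jspc.setLocalProperty(SCOPE_KEY, jspc.getLocalProperty(OLD_SCOPE_KEY));
    jspc.setLocalProperty(SCOPE_OVERRIDE, jspc.getLocalProperty(OLD_SCOPE_OVERRIDE));
    // Passing null removes a thread-local property entirely.
    jspc.setLocalProperty(OLD_SCOPE_KEY, null);
    jspc.setLocalProperty(OLD_SCOPE_OVERRIDE, null);
}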