org.apache.spark.sql.catalyst.plans.logical.LogicalPlan Java Examples

The following examples show how to use org.apache.spark.sql.catalyst.plans.logical.LogicalPlan in Java. Each snippet is drawn from an open-source project; the source file, project, and license are noted above each example.
Example #1
Source File: DatasetUtil.java    From jpmml-sparkml with GNU Affero General Public License v3.0
public static LogicalPlan createAnalyzedLogicalPlan(SparkSession sparkSession, StructType schema, String statement){
	// Generate a unique temporary view name and splice it into the statement
	String tableName = "sql2pmml_" + DatasetUtil.ID.getAndIncrement();

	statement = statement.replace("__THIS__", tableName);

	// Register an empty DataFrame with the requested schema, so that the statement can be analyzed against it
	Dataset<Row> dataset = sparkSession.createDataFrame(Collections.emptyList(), schema);

	dataset.createOrReplaceTempView(tableName);

	try {
		QueryExecution queryExecution = sparkSession.sql(statement).queryExecution();

		// Return the analyzed (resolved) plan rather than the raw parsed plan
		return queryExecution.analyzed();
	} finally {
		// Always drop the temporary view, even if analysis fails
		Catalog catalog = sparkSession.catalog();

		catalog.dropTempView(tableName);
	}
}
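The example above relies on a simple mechanism that is easy to isolate: a process-wide counter generates a collision-free view name, which is substituted for the `__THIS__` placeholder before the statement is analyzed. The following standalone sketch reproduces just that mechanism with the JDK only (no Spark dependency); the `PlaceholderDemo` class and `bindStatement` method names are ours, not part of jpmml-sparkml:

```java
import java.util.concurrent.atomic.AtomicLong;

public class PlaceholderDemo {

	// Counter mirroring DatasetUtil.ID: each call yields a fresh table name
	private static final AtomicLong ID = new AtomicLong();

	// Replaces the __THIS__ placeholder with a generated, collision-free view name
	public static String bindStatement(String statement){
		String tableName = "sql2pmml_" + ID.getAndIncrement();

		return statement.replace("__THIS__", tableName);
	}

	public static void main(String[] args){
		System.out.println(bindStatement("SELECT x1 + x2 FROM __THIS__"));
		System.out.println(bindStatement("SELECT flag FROM __THIS__"));
	}
}
```

Because the counter is an `AtomicLong`, concurrent callers cannot receive the same view name, which is what makes it safe to register and drop the temporary views independently.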
 
Example #2
Source File: ExpressionTranslatorTest.java    From jpmml-sparkml with GNU Affero General Public License v3.0
private static Expression translateInternal(String sqlStatement){
	// Schema of the synthetic table that the SQL statement is analyzed against
	StructType schema = new StructType()
		.add("flag", DataTypes.BooleanType)
		.add("x1", DataTypes.DoubleType)
		.add("x2", DataTypes.DoubleType)
		.add("status", DataTypes.IntegerType);

	LogicalPlan logicalPlan = DatasetUtil.createAnalyzedLogicalPlan(ExpressionTranslatorTest.sparkSession, schema, sqlStatement);

	// The analyzed plan is expected to carry exactly one expression
	List<Expression> expressions = JavaConversions.seqAsJavaList(logicalPlan.expressions());
	if(expressions.size() != 1){
		throw new IllegalArgumentException("Expected exactly one expression, got " + expressions.size());
	}

	return expressions.get(0);
}
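The size check above is a general "exactly one element" guard that can be factored into a small generic helper. A minimal JDK-only sketch (the `Singletons` class and `requireSingleton` name are illustrative, not from jpmml-sparkml):

```java
import java.util.List;

public class Singletons {

	// Returns the sole element of the list, or throws if the size is not exactly 1
	public static <E> E requireSingleton(List<E> elements){
		if(elements.size() != 1){
			throw new IllegalArgumentException("Expected exactly one element, got " + elements.size());
		}

		return elements.get(0);
	}
}
```

With such a helper, the tail of `translateInternal` would reduce to `return Singletons.requireSingleton(expressions);`.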
 
Example #3
Source File: Dataset.java    From incubator-nemo with Apache License 2.0
/**
 * Constructor.
 *
 * @param sparkSession spark session.
 * @param logicalPlan  spark logical plan.
 * @param encoder      spark encoder.
 */
private Dataset(final SparkSession sparkSession, final LogicalPlan logicalPlan, final Encoder<T> encoder) {
  super(sparkSession, logicalPlan, encoder);
}
 
Example #4
Source File: Dataset.java    From incubator-nemo with Apache License 2.0
/**
 * Overrides super.ofRows.
 *
 * @param sparkSession Spark Session.
 * @param logicalPlan  Spark logical plan.
 * @return Dataset of the given rows.
 */
public static Dataset<Row> ofRows(final org.apache.spark.sql.SparkSession sparkSession,
                                  final org.apache.spark.sql.catalyst.plans.logical.LogicalPlan logicalPlan) {
  return from(org.apache.spark.sql.Dataset.ofRows(sparkSession, logicalPlan));
}
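The `ofRows` override above follows a common wrap-the-delegate pattern: call the underlying library's static factory, then convert the result into the subclass via a `from` conversion. The shape of that pattern can be sketched with the JDK alone; `Base` and `Wrapped` below are illustrative stand-ins for Spark's `Dataset` and Nemo's subclass:

```java
public class WrappingDemo {

	static class Base {
		final String payload;

		Base(final String payload){ this.payload = payload; }

		// Library-side factory, analogous to org.apache.spark.sql.Dataset.ofRows
		static Base of(final String payload){ return new Base(payload); }
	}

	static class Wrapped extends Base {
		private Wrapped(final String payload){ super(payload); }

		// Conversion into the subclass, analogous to the from(...) call in the example
		static Wrapped from(final Base base){ return new Wrapped(base.payload); }

		// Hiding factory: delegate to the superclass factory, then wrap the result
		static Wrapped of(final String payload){ return from(Base.of(payload)); }
	}

	public static void main(final String[] args){
		System.out.println(Wrapped.of("rows").payload);
	}
}
```

The benefit of this shape is that all plan-building logic stays in the delegate's factory; the subclass only adds its own type on top.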
 