public class Aggregate extends SparkPlan implements scala.Product, scala.Serializable
Groups input data by groupingExpressions and computes the aggregateExpressions for each group.
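For orientation, here is a minimal, hypothetical sketch (Scala, Spark 1.x Catalyst API) of how a query such as `SELECT key, SUM(value) AS total FROM t GROUP BY key` could be expressed with this operator. The column names `key` and `value`, the helper `groupByKeySum`, and the plan `childPlan` are placeholders for illustration, not part of this API.

```scala
import org.apache.spark.sql.catalyst.expressions.{Alias, Expression, NamedExpression, Sum}
import org.apache.spark.sql.execution.{Aggregate, SparkPlan}

// `childPlan` stands in for an already-planned physical operator whose output
// contains the (hypothetical) attributes `key` and `value`.
def groupByKeySum(childPlan: SparkPlan): Aggregate = {
  val key   = childPlan.output.find(_.name == "key").get
  val value = childPlan.output.find(_.name == "value").get

  val groupingExpressions: Seq[Expression] = Seq(key)          // GROUP BY key
  val aggregateExpressions: Seq[NamedExpression] =
    Seq(key, Alias(Sum(value), "total")())                     // SELECT key, SUM(value) AS total

  Aggregate(partial = false, groupingExpressions, aggregateExpressions, childPlan)
}
```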
| Modifier and Type | Class and Description |
|---|---|
| class | Aggregate.ComputedAggregate: An aggregate that needs to be computed for each row in a group. |
| class | Aggregate.ComputedAggregate$ |
| Constructor and Description |
|---|
| Aggregate(boolean partial, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> groupingExpressions, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> aggregateExpressions, SparkPlan child) |
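The `partial` constructor flag distinguishes the map-side (pre-shuffle) phase of an aggregation from the final phase. A hedged sketch of the two-node pattern this enables is shown below; the exact rewriting of the aggregate expressions between the two phases is done by the planner and is taken as given here, so `partialAggExprs` and `finalAggExprs` are placeholders.

```scala
import org.apache.spark.sql.catalyst.expressions.{Expression, NamedExpression}
import org.apache.spark.sql.execution.{Aggregate, SparkPlan}

// Hypothetical two-phase aggregation built from two Aggregate nodes.
def twoPhase(
    groupingExprs: Seq[Expression],
    partialAggExprs: Seq[NamedExpression],
    finalAggExprs: Seq[NamedExpression],
    childPlan: SparkPlan): Aggregate = {
  // Phase 1: aggregate each partition of the child locally; no shuffle is required.
  val partialAgg = Aggregate(partial = true, groupingExprs, partialAggExprs, childPlan)
  // Phase 2: combine the partial results; the planner later inserts whatever
  // exchange is needed to satisfy requiredChildDistribution().
  Aggregate(partial = false, groupingExprs, finalAggExprs, partialAgg)
}
```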
| Modifier and Type | Method and Description |
|---|---|
| scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> | aggregateExpressions() |
| SparkPlan | child() |
| RDD<org.apache.spark.sql.catalyst.expressions.Row> | execute() Substituted version of aggregateExpressions expressions which are used to compute final output rows given a group and the result of all aggregate computations. |
| scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> | groupingExpressions() |
| scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> | output() |
| org.apache.spark.sql.catalyst.plans.physical.Partitioning | outputPartitioning() |
| boolean | partial() |
| scala.collection.Seq<org.apache.spark.sql.catalyst.plans.physical.Distribution> | requiredChildDistribution() Specifies any partition requirements on the input data for this operator. |
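requiredChildDistribution() is what lets the planner decide whether the child's data already has the right layout or whether a shuffle has to be inserted first. A rough, hypothetical sketch of that check (not the actual planner code) might look like this:

```scala
import org.apache.spark.sql.execution.Aggregate

// Hypothetical helper: does the child's current partitioning already satisfy
// what this Aggregate requires, or would the planner have to add an exchange?
def needsExchange(agg: Aggregate): Boolean = {
  val required = agg.requiredChildDistribution.head // one entry per child; Aggregate has one child
  !agg.child.outputPartitioning.satisfies(required)
}
```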
Methods inherited from class SparkPlan:
codegenEnabled, executeCollect, makeCopy, outputPartitioning

Methods inherited from class org.apache.spark.sql.catalyst.plans.QueryPlan:
expressions, org$apache$spark$sql$catalyst$plans$QueryPlan$$transformExpressionDown$1, org$apache$spark$sql$catalyst$plans$QueryPlan$$transformExpressionUp$1, outputSet, printSchema, schema, schemaString, transformAllExpressions, transformExpressions, transformExpressionsDown, transformExpressionsUp

Methods inherited from class org.apache.spark.sql.catalyst.trees.TreeNode:
apply, argString, asCode, children, collect, fastEquals, flatMap, foreach, generateTreeString, getNodeNumbered, id, map, mapChildren, nextId, nodeName, numberedTreeString, otherCopyArgs, sameInstance, simpleString, stringArgs, toString, transform, transformChildrenDown, transformChildrenUp, transformDown, transformUp, treeString, withNewChildren

Methods inherited from interface scala.Product:
productArity, productElement, productIterator, productPrefix

Methods inherited from interface org.apache.spark.Logging:
initialized, initializeIfNecessary, initializeLogging, initLock, isTraceEnabled, log_, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning

public Aggregate(boolean partial,
                 scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> groupingExpressions,
                 scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> aggregateExpressions,
                 SparkPlan child)
public boolean partial()
public scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> groupingExpressions()
public scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.NamedExpression> aggregateExpressions()
public SparkPlan child()
public scala.collection.Seq<org.apache.spark.sql.catalyst.plans.physical.Distribution> requiredChildDistribution()
Specifies any partition requirements on the input data for this operator.
Overrides: requiredChildDistribution in class SparkPlan

public scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> output()
Overrides: output in class org.apache.spark.sql.catalyst.plans.QueryPlan<SparkPlan>

public RDD<org.apache.spark.sql.catalyst.expressions.Row> execute()

public org.apache.spark.sql.catalyst.plans.physical.Partitioning outputPartitioning()
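To run the operator, a caller would typically invoke execute() and work with the resulting RDD of Catalyst Row objects. A hedged, driver-side sketch, assuming `agg` is an already planned Aggregate whose child can actually produce data on the current cluster:

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.catalyst.expressions.Row
import org.apache.spark.sql.execution.Aggregate

// Hypothetical usage: inspect the output schema, then materialize the result.
def runAggregate(agg: Aggregate): Unit = {
  // The output attributes describe the schema of the rows produced below.
  agg.output.foreach(attr => println(s"${attr.name}: ${attr.dataType}"))

  val rows: RDD[Row] = agg.execute() // lazily builds the RDD; no job runs yet
  rows.collect().foreach(println)    // triggers the actual Spark job
}
```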