Interface | Description |
---|---|
AccumulableParam<R,T> | Deprecated: use AccumulatorV2 instead (see the migration sketch after this table). |
AccumulatorParam<T> | Deprecated: use AccumulatorV2 instead. |
CleanupTask | Classes that represent cleaning tasks. |
ExecutorPlugin | A plugin that can be automatically instantiated within each Spark executor. |
FutureAction<T> | A future for the result of an action, to support cancellation. |
JobSubmitter | Handle through which a "run" function passed to a ComplexFutureAction can submit jobs for execution. |
Partition | An identifier for a partition in an RDD. |
SparkExecutorInfo | Exposes information about Spark executors. |
SparkJobInfo | Exposes information about Spark jobs. |
SparkStageInfo | Exposes information about Spark stages. |
TaskEndReason | Developer API. Various possible reasons why a task ended. |
TaskFailedReason | Developer API. Various possible reasons why a task failed. |
Class | Description |
---|---|
Accumulable<R,T> | Deprecated: use AccumulatorV2 instead. |
Accumulator<T> | Deprecated: use AccumulatorV2 instead. |
AccumulatorParam.DoubleAccumulatorParam$ | Deprecated: use AccumulatorV2 instead. |
AccumulatorParam.FloatAccumulatorParam$ | Deprecated: use AccumulatorV2 instead. |
AccumulatorParam.IntAccumulatorParam$ | Deprecated: use AccumulatorV2 instead. |
AccumulatorParam.LongAccumulatorParam$ | Deprecated: use AccumulatorV2 instead. |
AccumulatorParam.StringAccumulatorParam$ | Deprecated: use AccumulatorV2 instead. |
Aggregator<K,V,C> | Developer API. A set of functions used to aggregate data. |
BarrierTaskContext | Experimental. A TaskContext with extra contextual info and tooling for tasks in a barrier stage. |
BarrierTaskInfo | Experimental. Carries all task infos of a barrier task. |
CleanAccum | |
CleanBroadcast | |
CleanCheckpoint | |
CleanRDD | |
CleanShuffle | |
CleanupTaskWeakReference | A WeakReference associated with a CleanupTask. |
ComplexFutureAction<T> | A FutureAction for actions that could trigger multiple Spark jobs. |
ContextBarrierId | At most one barrier() call can be active per barrier stage attempt at any time, so (stageId, stageAttemptId) identifies the stage attempt a barrier() call comes from. |
Dependency<T> | Developer API. Base class for dependencies. |
ExceptionFailure | Developer API. Task failed due to a runtime exception. |
ExecutorLostFailure | Developer API. The task failed because the executor it was running on was lost. |
ExecutorRegistered | |
ExecutorRemoved | |
ExpireDeadHosts | |
FetchFailed | Developer API. Task failed to fetch shuffle data from a remote node. |
HashPartitioner | A Partitioner that implements hash-based partitioning using Java's Object.hashCode. |
InternalAccumulator | A collection of fields and methods concerned with internal accumulators that represent task-level metrics. |
InternalAccumulator.input$ | |
InternalAccumulator.output$ | |
InternalAccumulator.shuffleRead$ | |
InternalAccumulator.shuffleWrite$ | |
InterruptibleIterator<T> | Developer API. An iterator that wraps an existing iterator to provide task-killing functionality. |
NarrowDependency<T> | Developer API. Base class for dependencies where each partition of the child RDD depends on a small number of partitions of the parent RDD. |
OneToOneDependency<T> | Developer API. Represents a one-to-one dependency between partitions of the parent and child RDDs. |
Partitioner | An object that defines how the elements in a key-value pair RDD are partitioned by key (see the partitioning sketch after this table). |
RangeDependency<T> | Developer API. Represents a one-to-one dependency between ranges of partitions in the parent and child RDDs. |
RangePartitioner<K,V> | A Partitioner that partitions sortable records by range into roughly equal ranges. |
Resubmitted | Developer API. A org.apache.spark.scheduler.ShuffleMapTask that completed successfully earlier, but whose executor was lost before the stage completed. |
SerializableWritable<T extends org.apache.hadoop.io.Writable> | |
ShuffleDependency<K,V,C> | Developer API. Represents a dependency on the output of a shuffle stage. |
ShuffleStatus | Helper class used by the MapOutputTrackerMaster to perform bookkeeping for a single ShuffleMapStage. |
SimpleFutureAction<T> | A FutureAction holding the result of an action that triggers a single job. |
SparkConf | Configuration for a Spark application. |
SparkContext | Main entry point for Spark functionality. |
SparkEnv | Developer API. Holds all the runtime environment objects for a running Spark instance (either master or worker), including the serializer, RpcEnv, block manager, map output tracker, etc. |
SparkExecutorInfoImpl | |
SparkFiles | Resolves paths to files added through SparkContext.addFile(). |
SparkFirehoseListener | Class that allows users to receive all SparkListener events. |
SparkJobInfoImpl | |
SparkMasterRegex | A collection of regexes for extracting information from the master string. |
SparkStageInfoImpl | |
SparkStatusTracker | Low-level status reporting APIs for monitoring job and stage progress. |
SpillListener | A SparkListener that detects whether spills have occurred in Spark jobs. |
StopMapOutputTracker | |
Success | Developer API. Task succeeded. |
TaskCommitDenied | Developer API. Task requested the driver to commit, but was denied. |
TaskContext | Contextual information about a task that can be read or mutated during execution (see the sketch after this table). |
TaskKilled | Developer API. Task was killed intentionally and needs to be rescheduled. |
TaskResultLost | Developer API. The task finished successfully, but the result was lost from the executor's block manager before it was fetched. |
TaskSchedulerIsSet | An event that SparkContext uses to notify HeartbeatReceiver that SparkContext.taskScheduler is created. |
TaskState | |
TestUtils | Utilities for tests. |
UnknownReason | Developer API. We don't know why the task ended -- for example, because of a ClassNotFound exception when deserializing the task result. |
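Partitioner, HashPartitioner, and RangePartitioner control how keys are placed across an RDD's partitions. Below is a minimal sketch, assuming a local SparkContext; the SignPartitioner class, app name, and sample data are illustrative inventions, not part of the Spark API:

```scala
import org.apache.spark.{HashPartitioner, Partitioner, SparkConf, SparkContext}

object PartitionerSketch {
  // Hypothetical custom Partitioner: negative keys go to partition 0,
  // everything else is hashed across the remaining partitions.
  class SignPartitioner(override val numPartitions: Int) extends Partitioner {
    require(numPartitions >= 2, "need partition 0 for negatives plus at least one more")
    override def getPartition(key: Any): Int = key match {
      case k: Int if k < 0 => 0
      case k => 1 + java.lang.Math.floorMod(k.hashCode, numPartitions - 1)
    }
    // Production partitioners should also override equals/hashCode so Spark
    // can detect when two RDDs are partitioned the same way.
  }

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("partitioner-demo").setMaster("local[*]"))
    val pairs = sc.parallelize(Seq(-3 -> "a", 7 -> "b", 7 -> "c", -1 -> "d"))

    val hashed = pairs.partitionBy(new HashPartitioner(4))  // placement by Object.hashCode
    val signed = pairs.partitionBy(new SignPartitioner(4))  // custom placement

    println(hashed.partitioner)                          // Some(org.apache.spark.HashPartitioner@...)
    println(signed.glom().collect().map(_.length).toSeq) // per-partition element counts
    sc.stop()
  }
}
```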
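TaskContext gives code running inside a task access to per-task runtime information such as the stage, partition, and attempt it belongs to. A small hedged sketch (app name and data are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext, TaskContext}

object TaskContextSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("task-context-demo").setMaster("local[2]"))

    // TaskContext.get() returns the context of the task executing the closure.
    val tagged = sc.parallelize(1 to 10, numSlices = 4).map { x =>
      val ctx = TaskContext.get()
      (ctx.stageId(), ctx.partitionId(), ctx.attemptNumber(), x)
    }

    tagged.collect().foreach(println)
    sc.stop()
  }
}
```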
Enum | Description |
---|---|
JobExecutionStatus | Status of a job: RUNNING, SUCCEEDED, FAILED, or UNKNOWN (see the status-tracking sketch after this table). |
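JobExecutionStatus values surface through SparkStatusTracker, which returns the SparkJobInfo and SparkStageInfo interfaces listed above. A hedged sketch of polling job and stage progress after running a job (the job group name and workload are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object StatusTrackerSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("status-demo").setMaster("local[2]"))

    sc.setJobGroup("demo-group", "example jobs")       // tag subsequent jobs
    sc.parallelize(1 to 1000000, 8).map(_ * 2).count() // run one job

    val tracker = sc.statusTracker // a SparkStatusTracker
    for (jobId <- tracker.getJobIdsForGroup("demo-group");
         job   <- tracker.getJobInfo(jobId)) {
      println(s"job $jobId: ${job.status}") // a JobExecutionStatus value
      for (stageId <- job.stageIds; stage <- tracker.getStageInfo(stageId)) {
        println(s"  stage $stageId: ${stage.numCompletedTasks}/${stage.numTasks} tasks")
      }
    }
    sc.stop()
  }
}
```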
Exception | Description |
---|---|
SparkException | |
TaskKilledException | Developer API. Exception thrown when a task is explicitly killed (i.e., task failure is expected). |
A few classes, such as Accumulator and StorageLevel, are also used in Java, but the org.apache.spark.api.java package contains the main Java API.