:: DeveloperApi :: Thrown when a query fails to analyze, usually because the query itself is invalid.
:: Experimental :: A column that will be computed based on the data in a DataFrame.
:: Experimental :: A convenient class used for constructing schema.
:: Experimental :: A distributed collection of data organized into named columns.
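A minimal DataFrame sketch, assuming a local Spark setup; the application name and the `people.json` path are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Local Spark context and SQL entry point (hypothetical configuration).
val sc = new SparkContext(new SparkConf().setAppName("example").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)

// Load a DataFrame and run relational operations on its named columns.
val df = sqlContext.read.json("people.json")
df.select("name").show()
df.filter(df("age") > 21).show()
```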
A container for a DataFrame, used for implicit conversions.
:: Experimental :: Functionality for working with missing data in DataFrames.
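A sketch of the `na` functions, assuming a DataFrame `df` with a numeric `age` column (both names are illustrative):

```scala
// Drop any row containing a null or NaN value.
val cleaned = df.na.drop()

// Replace nulls in the "age" column with a default value.
val filled = df.na.fill(0, Seq("age"))
```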
:: Experimental :: Interface used to load a DataFrame from external storage systems (e.g. file systems, key-value stores).
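A reader sketch, assuming an existing `sqlContext` and placeholder input paths:

```scala
// Format-specific shortcut methods...
val parquetDf = sqlContext.read.parquet("data.parquet")

// ...or the generic format/load pair, useful for third-party data sources.
val jsonDf = sqlContext.read.format("json").load("data.json")
```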
:: Experimental :: Statistic functions for DataFrames.
:: Experimental :: Interface used to write a DataFrame to external storage systems (e.g. file systems, key-value stores).
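A writer sketch, assuming a DataFrame `df` and a placeholder output path; the save mode controls behavior when the target already exists:

```scala
// Overwrite any existing data at the target location.
df.write.mode("overwrite").parquet("out/people.parquet")
```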
:: Experimental :: A Dataset is a strongly typed collection of objects that can be transformed in parallel using functional or relational operations.
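A Dataset sketch, assuming an existing `sqlContext`; the `Person` case class and sample data are illustrative:

```scala
import sqlContext.implicits._

case class Person(name: String, age: Int)

// Build a strongly typed Dataset and transform it with lambdas,
// keeping compile-time type checking (unlike untyped DataFrame columns).
val ds = Seq(Person("Ann", 30), Person("Bob", 17)).toDS()
val adultNames = ds.filter(_.age >= 18).map(_.name)
```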
A container for a Dataset, used for implicit conversions.
:: Experimental :: Used to convert a JVM object of type T to and from the internal Spark SQL representation.
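An encoder sketch; implicit encoders for common types come from `sqlContext.implicits._`, while a custom class can fall back to Kryo serialization (the `Point` class is illustrative):

```scala
import org.apache.spark.sql.Encoders

case class Point(x: Double, y: Double)

// Kryo-based encoder: stores objects as opaque binary, so columnar
// optimizations do not apply, but any serializable class works.
val pointEncoder = Encoders.kryo[Point]
val points = sqlContext.createDataset(Seq(Point(1, 2)))(pointEncoder)
```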
:: Experimental :: Holder for experimental methods for the bravest.
:: Experimental :: A set of methods for aggregations on a DataFrame, created by DataFrame.groupBy.
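A grouped-aggregation sketch, assuming a DataFrame `df` with hypothetical `department`, `salary`, and `age` columns:

```scala
import org.apache.spark.sql.functions._

// One output row per department, with several aggregates at once.
df.groupBy("department")
  .agg(avg("salary"), max("age"), count("*"))
  .show()
```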
:: Experimental :: A Dataset that has been logically grouped by a user-specified grouping key.
Represents one row of output from a relational operator.
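A Row access sketch; fields are addressed by ordinal position, and the typed getters must match the underlying schema:

```scala
import org.apache.spark.sql.Row

val row = Row("Alice", 30)
val name = row.getString(0) // field 0 as a String
val age  = row.getInt(1)    // field 1 as an Int
```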
The entry point for working with structured data (rows and columns) in Spark.
A collection of implicit methods for converting common Scala objects into DataFrames.
Converts a logical plan into zero or more SparkPlans. This API is exposed for experimenting with the query planner and is not designed to be stable across Spark releases. Developers writing libraries should instead consider using the stable APIs provided in org.apache.spark.sql.sources.
A Column where an Encoder has been given for the expected input and return type.
Functions for registering user-defined functions.
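A UDF registration sketch, assuming an existing `sqlContext` and a registered temporary table `people`; the `strLen` name is illustrative:

```scala
// Register a Scala function for use in SQL text.
sqlContext.udf.register("strLen", (s: String) => s.length)

val lengths = sqlContext.sql("SELECT strLen(name) FROM people")
```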
A user-defined function.
Type alias for DataFrame. Kept here for backward source compatibility for Scala.
(Deprecated since version 1.3.0) Use DataFrame instead.
:: Experimental :: Methods for creating an Encoder.
The SQLContext companion object contains utility functions for creating a singleton SQLContext instance, or for retrieving the previously created instance.
Contains API classes that are specific to a single language (i.e. Java).
:: Experimental :: Functions available for DataFrame.
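A sketch of the built-in column functions, assuming a DataFrame `df` with hypothetical `name` and `age` columns:

```scala
import org.apache.spark.sql.functions._

df.select(
  upper(col("name")),                               // string function
  when(col("age") > 21, "adult").otherwise("minor") // conditional expression
).show()
```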
Support for running Spark SQL queries using functionality from Apache Hive (does not require an existing Hive installation).
A set of APIs for adding data sources to Spark SQL.
Contains a type system for attributes produced by relations, including complex types like structs, arrays and maps.
Allows the execution of relational queries, including those expressed in SQL using Spark.