Package org.apache.spark.api.java

package java

Spark Java programming APIs.

Source
package.scala
Linear Supertypes
AnyRef, Any

Type Members

  1. class JavaDoubleRDD extends AbstractJavaRDDLike[Double, JavaDoubleRDD]

  2. trait JavaFutureAction[T] extends Future[T]

  3. class JavaHadoopRDD[K, V] extends JavaPairRDD[K, V]

    Annotations
    @DeveloperApi()
  4. class JavaNewHadoopRDD[K, V] extends JavaPairRDD[K, V]

    Annotations
    @DeveloperApi()
  5. class JavaPairRDD[K, V] extends AbstractJavaRDDLike[(K, V), JavaPairRDD[K, V]]

  6. class JavaRDD[T] extends AbstractJavaRDDLike[T, JavaRDD[T]]

  7. trait JavaRDDLike[T, This <: JavaRDDLike[T, This]] extends Serializable

    Defines operations common to several Java RDD implementations.

    Note

    This trait is not intended to be implemented by user code.

  8. class JavaSparkContext extends JavaSparkContextVarargsWorkaround with Closeable

    A Java-friendly version of org.apache.spark.SparkContext that returns org.apache.spark.api.java.JavaRDDs and works with Java collections instead of Scala ones.

    Only one SparkContext may be active per JVM. You must stop() the active SparkContext before creating a new one. This limitation may eventually be removed; see SPARK-2243 for more details. A minimal usage sketch follows this list.

  9. class JavaSparkStatusTracker extends AnyRef

    Low-level status reporting APIs for monitoring job and stage progress.

    These APIs intentionally provide very weak consistency semantics; consumers of these APIs should be prepared to handle empty / missing information. For example, a job's stage ids may be known but the status API may not have any information about the details of those stages, so getStageInfo could potentially return null for a valid stage id.

    To limit memory usage, these APIs only provide information on recent jobs and stages: the last spark.ui.retainedStages stages and spark.ui.retainedJobs jobs. The null-safe access pattern is shown in the sketch after this list.

    Note

    This class's constructor should be considered private and may be subject to change.

  10. final class Optional[T] extends Serializable

  11. class StorageLevels extends AnyRef

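The following is a minimal sketch tying the entries above together (JavaSparkContext, the operations shared via JavaRDDLike, and JavaSparkStatusTracker). It assumes Spark is on the classpath, a local master, and Java 8+ lambdas; the class name JavaApiSketch, the app name, and the sample data are illustrative only and not part of the API.

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkJobInfo;
    import org.apache.spark.SparkStageInfo;
    import org.apache.spark.api.java.JavaDoubleRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.JavaSparkStatusTracker;

    public class JavaApiSketch {
      public static void main(String[] args) {
        // Only one SparkContext may be active per JVM: create it once and
        // stop() it before creating another.
        SparkConf conf = new SparkConf().setAppName("java-api-sketch").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        try {
          // JavaSparkContext accepts Java collections and returns JavaRDDs.
          JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5));

          // Operations such as mapToDouble are defined once on JavaRDDLike and
          // shared by JavaRDD, JavaPairRDD, JavaDoubleRDD, etc.
          JavaDoubleRDD squares = numbers.mapToDouble(x -> x * x);
          System.out.println("sum of squares = " + squares.sum());

          // Weakly consistent monitoring: every lookup may return null or an
          // empty result, so guard accordingly. (Normally this would be polled
          // from a separate thread while a job is still running.)
          JavaSparkStatusTracker tracker = sc.statusTracker();
          for (int jobId : tracker.getActiveJobIds()) {
            SparkJobInfo job = tracker.getJobInfo(jobId);            // may be null
            if (job == null) continue;
            for (int stageId : job.stageIds()) {
              SparkStageInfo stage = tracker.getStageInfo(stageId);  // may be null for a valid id
              if (stage != null) {
                System.out.println("stage " + stageId + ": " + stage.name());
              }
            }
          }
        } finally {
          sc.stop();  // free the per-JVM SparkContext slot
        }
      }
    }

Because squares.sum() blocks until the job completes, the status loop here will usually find no active jobs; it is included only to show the required null checks.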

Value Members

  1. object JavaDoubleRDD extends Serializable

  2. object JavaPairRDD extends Serializable

  3. object JavaRDD extends Serializable

  4. object JavaSparkContext

  5. package function

    Set of interfaces to represent functions in Spark's Java API. Users create implementations of these interfaces to pass functions to various Java API methods for Spark; a brief sketch follows below. Please visit Spark's Java programming guide for more details.
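
The following sketch shows the two ways these interfaces are typically used, assuming an existing JavaSparkContext; the class name FunctionInterfaceSketch and the sample data are illustrative only. Explicit anonymous-class implementations of Function and PairFunction are passed to map and mapToPair, with the equivalent Java 8+ lambda form alongside.

    import java.util.Arrays;

    import scala.Tuple2;

    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.Function;
    import org.apache.spark.api.java.function.PairFunction;

    public class FunctionInterfaceSketch {
      // Explicit implementations of two of the function interfaces; each has a
      // single call method, so a lambda can be passed instead on Java 8+.
      static final Function<String, Integer> LENGTH = new Function<String, Integer>() {
        @Override
        public Integer call(String s) {
          return s.length();
        }
      };

      static final PairFunction<String, String, Integer> WORD_AND_LENGTH =
          new PairFunction<String, String, Integer>() {
            @Override
            public Tuple2<String, Integer> call(String s) {
              return new Tuple2<>(s, s.length());
            }
          };

      static void run(JavaSparkContext sc) {
        JavaRDD<String> words = sc.parallelize(Arrays.asList("spark", "java", "api"));

        // map takes a Function<T, R>; mapToPair takes a PairFunction<T, K, V>.
        JavaRDD<Integer> lengths = words.map(LENGTH);
        JavaPairRDD<String, Integer> byWord = words.mapToPair(WORD_AND_LENGTH);

        // Equivalent lambda form of the map above.
        JavaRDD<Integer> lengths2 = words.map(s -> s.length());

        System.out.println(lengths.collect());
        System.out.println(lengths2.collect());
        System.out.println(byWord.collectAsMap());
      }
    }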
