package java
Package Members
- package function
Set of interfaces to represent functions in Spark's Java API. Users create implementations of these interfaces to pass functions to various Java API methods for Spark. Please visit Spark's Java programming guide for more details.
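For illustration, a sketch of how such a function implementation might be passed to a Java API method (the application name, data, and class name below are placeholders, not part of this package):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import java.util.Arrays;

public class FunctionExample {
    public static void main(String[] args) {
        // Local context only for illustration; configuration values are placeholders.
        SparkConf conf = new SparkConf().setAppName("FunctionExample").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> lines = sc.parallelize(Arrays.asList("a", "bb", "ccc"));

        // Implement the Function interface and pass it to map().
        Function<String, Integer> length = new Function<String, Integer>() {
            @Override
            public Integer call(String s) {
                return s.length();
            }
        };
        JavaRDD<Integer> lengths = lines.map(length);

        System.out.println(lengths.collect()); // [1, 2, 3]
        sc.stop();
    }
}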
Type Members
- class JavaDoubleRDD extends AbstractJavaRDDLike[Double, JavaDoubleRDD]
- trait JavaFutureAction[T] extends Future[T]
- class JavaHadoopRDD[K, V] extends JavaPairRDD[K, V]
- Annotations
- @DeveloperApi()
- class JavaNewHadoopRDD[K, V] extends JavaPairRDD[K, V]
- Annotations
- @DeveloperApi()
- class JavaPairRDD[K, V] extends AbstractJavaRDDLike[(K, V), JavaPairRDD[K, V]]
- class JavaRDD[T] extends AbstractJavaRDDLike[T, JavaRDD[T]]
- trait JavaRDDLike[T, This <: JavaRDDLike[T, This]] extends Serializable
Defines operations common to several Java RDD implementations.
- Note
This trait is not intended to be implemented by user code.
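As a rough sketch of what "common operations" means in practice (data and names below are placeholders), methods such as count() and take() declared by this trait are available on the different RDD flavors alike:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaDoubleRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import java.util.Arrays;

public class RddLikeExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("RddLikeExample").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> words = sc.parallelize(Arrays.asList("spark", "java", "api"));
        JavaDoubleRDD doubles = sc.parallelizeDoubles(Arrays.asList(1.0, 2.0, 3.0));

        // count() and take(n) come from JavaRDDLike on both RDD flavors.
        System.out.println(words.count());    // 3
        System.out.println(doubles.take(2));  // [1.0, 2.0]

        sc.stop();
    }
}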
- class JavaSparkContext extends Closeable
A Java-friendly version of org.apache.spark.SparkContext that returns org.apache.spark.api.java.JavaRDDs and works with Java collections instead of Scala ones.
- Note
Only one SparkContext should be active per JVM. You must stop() the active SparkContext before creating a new one.
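A minimal sketch of the intended lifecycle, assuming a local master and placeholder data:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import java.util.Arrays;

public class ContextExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("ContextExample").setMaster("local[*]");

        // Only one SparkContext should be active per JVM.
        JavaSparkContext sc = new JavaSparkContext(conf);
        try {
            // parallelize() turns a Java collection into a JavaRDD.
            JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(1, 2, 3, 4));
            System.out.println(numbers.count()); // 4
        } finally {
            // stop() the active context before creating another one.
            sc.stop();
        }
    }
}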
- class JavaSparkStatusTracker extends AnyRef
Low-level status reporting APIs for monitoring job and stage progress.
These APIs intentionally provide very weak consistency semantics; consumers of these APIs should be prepared to handle empty / missing information. For example, a job's stage ids may be known but the status API may not have any information about the details of those stages, so getStageInfo could potentially return null for a valid stage id.
To limit memory usage, these APIs only provide information on recent jobs / stages. These APIs will provide information for the last spark.ui.retainedStages stages and spark.ui.retainedJobs jobs.
- Note
This class's constructor should be considered private and may be subject to change.
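A hedged sketch of how a consumer might poll these APIs while tolerating missing information (the job handling below is illustrative only):

import org.apache.spark.SparkConf;
import org.apache.spark.SparkJobInfo;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.JavaSparkStatusTracker;

public class StatusTrackerExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("StatusTrackerExample").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaSparkStatusTracker tracker = sc.statusTracker();

        // Only recent jobs are retained; the returned array may be empty.
        for (int jobId : tracker.getActiveJobIds()) {
            SparkJobInfo info = tracker.getJobInfo(jobId);
            // getJobInfo() (like getStageInfo()) may return null even for a known id.
            if (info != null) {
                System.out.println("Job " + jobId + " status: " + info.status());
            }
        }
        sc.stop();
    }
}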
- final class Optional[T] extends Serializable
Like java.util.Optional in Java 8, scala.Option in Scala, and com.google.common.base.Optional in Google Guava, this class represents a value of a given type that may or may not exist. It is used in methods that wish to optionally return a value, in preference to returning null.
In fact, the class here is a reimplementation of the essential API of both java.util.Optional and com.google.common.base.Optional. From java.util.Optional, it implements:
- #empty()
- #of(Object)
- #ofNullable(Object)
- #get()
- #orElse(Object)
- #isPresent()
From com.google.common.base.Optional it implements:
- #absent()
- #of(Object)
- #fromNullable(Object)
- #get()
- #or(Object)
- #orNull()
- #isPresent()
java.util.Optional itself was not used because at the time, the project did not require Java 8. Using com.google.common.base.Optional has in the past caused serious library version conflicts with Guava that can't be resolved by shading. Hence this work-alike clone.
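A brief usage sketch of this class, with placeholder values, combining the java.util.Optional-style and Guava-style methods listed above:

import org.apache.spark.api.java.Optional;

public class OptionalExample {
    public static void main(String[] args) {
        // java.util.Optional-style factories.
        Optional<String> present = Optional.of("value");
        String missing = null;
        Optional<String> maybe = Optional.ofNullable(missing); // empty, not a null reference

        System.out.println(present.isPresent());      // true
        System.out.println(present.get());            // value
        System.out.println(maybe.orElse("fallback")); // fallback

        // Guava-style equivalents.
        Optional<String> absent = Optional.absent();
        System.out.println(absent.or("default"));  // default
        System.out.println(absent.orNull());       // null
    }
}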
- class StorageLevels extends AnyRef
Expose some commonly useful storage level constants.
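A minimal sketch, with placeholder data, of passing one of these constants to persist():

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.StorageLevels;
import java.util.Arrays;

public class StorageLevelsExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("StorageLevelsExample").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(1, 2, 3));

        // Cache in memory, spilling to disk when the data does not fit.
        numbers.persist(StorageLevels.MEMORY_AND_DISK);
        System.out.println(numbers.count()); // 3

        sc.stop();
    }
}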
Value Members
- object JavaDoubleRDD extends Serializable
- object JavaPairRDD extends Serializable
- object JavaRDD extends Serializable
- object JavaSparkContext