Package org.apache.spark.resource
Class ResourceProfile
java.lang.Object
  org.apache.spark.resource.ResourceProfile
- All Implemented Interfaces:
Serializable, org.apache.spark.internal.Logging
public class ResourceProfile
extends Object
implements Serializable, org.apache.spark.internal.Logging
Resource profile to associate with an RDD. A ResourceProfile allows the user to
specify executor and task requirements for an RDD that will get applied during a
stage. This allows the user to change the resource requirements between stages.
This is meant to be immutable so that users cannot change it after it is built;
users should use ResourceProfileBuilder to build it (see the usage sketch below).
param: executorResources Resource requests for executors. Mapped from the resource name (e.g., cores, memory, CPU) to its specific request.
param: taskResources Resource requests for tasks. Mapped from the resource name (e.g., cores, memory, CPU) to its specific request.
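The usual way to build and apply a profile is through ResourceProfileBuilder together with ExecutorResourceRequests and TaskResourceRequests. The following is a minimal Scala sketch; the GPU resource name, the amounts, and the discovery-script path are illustrative assumptions, and `rdd` stands for an existing RDD already in scope.

  import org.apache.spark.resource.{ExecutorResourceRequests, ResourceProfileBuilder, TaskResourceRequests}

  // Executor-side requirements for the stages that compute this RDD
  // (4 cores, 8g of heap, and 1 GPU found via a hypothetical discovery script).
  val executorReqs = new ExecutorResourceRequests()
    .cores(4)
    .memory("8g")
    .resource("gpu", 1, "/opt/spark/scripts/getGpus.sh")

  // Task-side requirements (1 CPU and 1 GPU per task).
  val taskReqs = new TaskResourceRequests()
    .cpus(1)
    .resource("gpu", 1.0)

  // Build the immutable profile and attach it to the RDD; it takes effect
  // for the stages that compute that RDD.
  val profile = new ResourceProfileBuilder()
    .require(executorReqs)
    .require(taskReqs)
    .build()

  val tunedRdd = rdd.withResources(profile)

Because a profile is immutable once built, different requirements for a later stage are expressed by building a separate profile and attaching it to the RDD of that stage.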
Nested Class Summary
Nested Classes
static class
static class
Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging:
org.apache.spark.internal.Logging.LogStringContext, org.apache.spark.internal.Logging.SparkShellLoggingFilter
Constructor Summary
Constructors
ResourceProfile(scala.collection.immutable.Map<String, ExecutorResourceRequest> executorResources, scala.collection.immutable.Map<String, TaskResourceRequest> taskResources)
Method Summary
static String[] allSupportedExecutorResources() - Return all supported Spark built-in executor resources; custom resources like GPUs/FPGAs are excluded.
static String CORES() - built-in executor resource: cores
static String CPUS() - built-in task resource: cpus
static int DEFAULT_RESOURCE_PROFILE_ID()
boolean equals(Object obj)
scala.collection.immutable.Map<String, ExecutorResourceRequest> executorResources()
java.util.Map<String, ExecutorResourceRequest> executorResourcesJMap() - (Java-specific) gets a Java Map of resources to ExecutorResourceRequest
int hashCode()
int id() - A unique id of this ResourceProfile
static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
static String MEMORY() - built-in executor resource: memory
static String OFFHEAP_MEM() - built-in executor resource: offHeap
static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
static String OVERHEAD_MEM() - built-in executor resource: memoryOverhead
static String PYSPARK_MEM() - built-in executor resource: pyspark.memory
scala.collection.immutable.Map<String, TaskResourceRequest> taskResources()
java.util.Map<String, TaskResourceRequest> taskResourcesJMap() - (Java-specific) gets a Java Map of resources to TaskResourceRequest
String toString()
static int UNKNOWN_RESOURCE_PROFILE_ID()
Methods inherited from interface org.apache.spark.internal.Logging
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logBasedOnLevel, logDebug, logDebug, logDebug, logDebug, logError, logError, logError, logError, logInfo, logInfo, logInfo, logInfo, logName, LogStringContext, logTrace, logTrace, logTrace, logTrace, logWarning, logWarning, logWarning, logWarning, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, withLogContext
Constructor Details
ResourceProfile
public ResourceProfile(scala.collection.immutable.Map<String, ExecutorResourceRequest> executorResources, scala.collection.immutable.Map<String, TaskResourceRequest> taskResources)
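For reference, a sketch of calling this constructor directly from Scala; the resource names and amounts are illustrative assumptions, and ResourceProfileBuilder remains the recommended way to create profiles.

  import org.apache.spark.resource.{ExecutorResourceRequest, ResourceProfile, TaskResourceRequest}

  // Keys are resource names, values are the corresponding request objects.
  val executorResources = Map(
    "cores"  -> new ExecutorResourceRequest("cores", 4L),
    "memory" -> new ExecutorResourceRequest("memory", 8192L)  // memory amount assumed to be in MiB
  )
  val taskResources = Map(
    "cpus" -> new TaskResourceRequest("cpus", 1.0)
  )

  val profile = new ResourceProfile(executorResources, taskResources)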
Method Details
CPUS
public static String CPUS()
built-in task resource: cpus
Returns:
(undocumented)
CORES
public static String CORES()
built-in executor resource: cores
Returns:
(undocumented)
MEMORY
public static String MEMORY()
built-in executor resource: memory
Returns:
(undocumented)
OFFHEAP_MEM
public static String OFFHEAP_MEM()
built-in executor resource: offHeap
Returns:
(undocumented)
OVERHEAD_MEM
public static String OVERHEAD_MEM()
built-in executor resource: memoryOverhead
Returns:
(undocumented)
PYSPARK_MEM
public static String PYSPARK_MEM()
built-in executor resource: pyspark.memory
Returns:
(undocumented)
allSupportedExecutorResources
public static String[] allSupportedExecutorResources()
Return all supported Spark built-in executor resources; custom resources like GPUs/FPGAs are excluded.
Returns:
(undocumented)
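The constants above are the keys used in a profile's resource maps, so they can be used to look requests back up. A short sketch, assuming `profile` is a ResourceProfile built elsewhere:

  import org.apache.spark.resource.ResourceProfile

  // List the built-in executor resource names (custom resources such as GPUs are excluded).
  ResourceProfile.allSupportedExecutorResources.foreach(println)

  // Look up individual requests by their built-in names.
  val coreReq = profile.executorResources.get(ResourceProfile.CORES)
  val memReq  = profile.executorResources.get(ResourceProfile.MEMORY)
  coreReq.foreach(r => println(s"executor cores requested: ${r.amount}"))
  memReq.foreach(r => println(s"executor memory requested: ${r.amount}"))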
UNKNOWN_RESOURCE_PROFILE_ID
public static int UNKNOWN_RESOURCE_PROFILE_ID()
DEFAULT_RESOURCE_PROFILE_ID
public static int DEFAULT_RESOURCE_PROFILE_ID()
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
org$apache$spark$internal$Logging$$log__$eq
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
LogStringContext
public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
executorResources
public scala.collection.immutable.Map<String, ExecutorResourceRequest> executorResources()
taskResources
public scala.collection.immutable.Map<String, TaskResourceRequest> taskResources()
id
public int id()
A unique id of this ResourceProfile
Returns:
(undocumented)
taskResourcesJMap
public java.util.Map<String, TaskResourceRequest> taskResourcesJMap()
(Java-specific) gets a Java Map of resources to TaskResourceRequest
Returns:
(undocumented)
executorResourcesJMap
public java.util.Map<String, ExecutorResourceRequest> executorResourcesJMap()
(Java-specific) gets a Java Map of resources to ExecutorResourceRequest
Returns:
(undocumented)
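A brief sketch of the Java-friendly accessors used from Scala; unlike the Scala maps returned by executorResources() and taskResources(), the returned java.util.Map yields null for missing keys. `profile` is again assumed to be an existing ResourceProfile.

  import org.apache.spark.resource.ResourceProfile

  // java.util.Map views of the same request maps; missing keys come back as null.
  val execJMap = profile.executorResourcesJMap
  val taskJMap = profile.taskResourcesJMap

  val cores = execJMap.get(ResourceProfile.CORES)
  if (cores != null) println(s"executor cores requested: ${cores.amount}")

  val cpus = taskJMap.get(ResourceProfile.CPUS)
  if (cpus != null) println(s"task cpus requested: ${cpus.amount}")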
equals
public boolean equals(Object obj)
hashCode
public int hashCode()
toString
public String toString()