Package org.apache.spark.util
Class AccumulatorContext
java.lang.Object
    org.apache.spark.util.AccumulatorContext
An internal class used to track accumulators by Spark itself.
Constructor Summary
Constructors
AccumulatorContext()

Method Summary
static void clear()
    Clears all registered AccumulatorV2s.
static scala.Option<AccumulatorV2<?,?>> get(long id)
    Returns the AccumulatorV2 registered with the given ID, if any.
static scala.Option<Object> internOption(scala.Option<Object> value)
    Naive way to reduce duplicate Some objects for the values 0 and -1. TODO: if this spreads to more values, Guava's weak interner would be a better solution.
static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
static long newId()
    Returns a globally unique ID for a new AccumulatorV2.
static int numAccums()
    Returns the number of accumulators registered.
static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
static void register(AccumulatorV2<?,?> a)
    Registers an AccumulatorV2 created on the driver such that it can be used on the executors.
static void remove(long id)
    Unregisters the AccumulatorV2 with the given ID, if any.
Constructor Details

AccumulatorContext
public AccumulatorContext()

Method Details
newId
public static long newId()
Returns a globally unique ID for a new AccumulatorV2. Note: once you copy the AccumulatorV2, the ID is no longer unique.
Returns:
(undocumented)
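A globally unique, process-wide ID of this kind is typically produced from an atomic counter. The following is a minimal sketch of that pattern (the class name is illustrative, not Spark's internal implementation):

```java
import java.util.concurrent.atomic.AtomicLong;

// Sketch of a newId()-style counter: each call hands out a fresh,
// process-wide unique ID, safely across threads.
class IdSketch {
    private static final AtomicLong nextId = new AtomicLong(0L);

    static long newId() {
        return nextId.getAndIncrement();
    }
}
```

Note that copying an object that carries such an ID duplicates the ID rather than allocating a new one, which is why a copied AccumulatorV2 no longer has a unique ID.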
numAccums
public static int numAccums()
Returns the number of accumulators registered. Used in testing.
register
public static void register(AccumulatorV2<?,?> a)
Registers an AccumulatorV2 created on the driver such that it can be used on the executors.
All accumulators registered here can later be used as a container for accumulating partial values across multiple tasks. This is what org.apache.spark.scheduler.DAGScheduler does. Note: if an accumulator is registered here, it should also be registered with the active context cleaner for cleanup, so as to avoid memory leaks.
If an AccumulatorV2 with the same ID was already registered, this does nothing instead of overwriting it. We never register the same accumulator twice; this is just a sanity check.
Parameters:
a - (undocumented)
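The register/get/remove/numAccums contract described above, including the "does nothing instead of overwriting" behavior, can be sketched with a ConcurrentHashMap. This is a simplified illustration, not Spark's actual implementation; the Acc class stands in for AccumulatorV2:

```java
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Simplified stand-in for an accumulator; the real code uses AccumulatorV2<?,?>.
class Acc {
    final long id;
    Acc(long id) { this.id = id; }
}

// Sketch of the registry: register() keeps the first registration for an ID,
// get() looks an accumulator up, remove() unregisters it.
class RegistrySketch {
    private static final ConcurrentHashMap<Long, Acc> registry = new ConcurrentHashMap<>();

    static void register(Acc a) {
        // putIfAbsent: registering the same ID a second time is a no-op,
        // matching "this does nothing instead of overwriting it".
        registry.putIfAbsent(a.id, a);
    }

    static Optional<Acc> get(long id) {
        return Optional.ofNullable(registry.get(id));
    }

    static void remove(long id) {
        registry.remove(id);
    }

    static int numAccums() {
        return registry.size();
    }
}
```

In Spark, entries registered this way must also be handed to the context cleaner, since the registry itself holds on to them until remove() is called.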
remove
public static void remove(long id)
Unregisters the AccumulatorV2 with the given ID, if any.
Parameters:
id - (undocumented)
get
public static scala.Option<AccumulatorV2<?,?>> get(long id)
Returns the AccumulatorV2 registered with the given ID, if any.
Parameters:
id - (undocumented)
Returns:
(undocumented)
clear
public static void clear()
Clears all registered AccumulatorV2s. For testing only.
internOption
public static scala.Option<Object> internOption(scala.Option<Object> value)
Naive way to reduce duplicate Some objects for the values 0 and -1. TODO: eventually, if this spreads to more values, using Guava's weak interner would be a better solution.
Parameters:
value - (undocumented)
Returns:
(undocumented)
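The interning idea is to keep one shared wrapper object for each hot value and return it instead of allocating a fresh one on every call. A sketch of that technique in plain Java, using Optional in place of scala.Option (illustrative only, not the Scala original):

```java
import java.util.Optional;

// Sketch of internOption: reuse a single shared wrapper for the common
// values 0 and -1 instead of allocating a new one on each call.
class InternSketch {
    private static final Optional<Object> SOME_ZERO = Optional.of((Object) 0L);
    private static final Optional<Object> SOME_MINUS_ONE = Optional.of((Object) -1L);

    static Optional<Object> internOption(Optional<Object> value) {
        if (value.isPresent()) {
            Object v = value.get();
            if (Long.valueOf(0L).equals(v)) return SOME_ZERO;
            if (Long.valueOf(-1L).equals(v)) return SOME_MINUS_ONE;
        }
        // Any other value passes through unchanged.
        return value;
    }
}
```

Repeated calls with the value 0 all return the same cached instance, so no duplicate wrappers accumulate for the hot values; a weak interner (as the TODO suggests) would generalize this to arbitrary values.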
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()

org$apache$spark$internal$Logging$$log__$eq
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)

LogStringContext
public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)