Class AccumulatorContext

java.lang.Object
  org.apache.spark.util.AccumulatorContext

public class AccumulatorContext extends Object
An internal class used by Spark itself to track accumulators.
  • Constructor Details

    • AccumulatorContext

      public AccumulatorContext()
  • Method Details

    • newId

      public static long newId()
      Returns a globally unique ID for a new AccumulatorV2. Note: copying an AccumulatorV2 preserves its ID, so after a copy the ID is no longer unique.
      Returns:
      a new, globally unique accumulator ID
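      For illustration, a minimal sketch (this is an internal API, so applications would not normally call it; shown here only to make the uniqueness guarantee concrete):

        import org.apache.spark.util.AccumulatorContext;

        public class NewIdSketch {
            public static void main(String[] args) {
                // Each call hands out a fresh ID from an internal counter,
                // so two new accumulators never collide.
                long first = AccumulatorContext.newId();
                long second = AccumulatorContext.newId();
                assert first != second;
            }
        }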
    • numAccums

      public static int numAccums()
      Returns the number of currently registered accumulators. Used only in testing.
    • register

      public static void register(AccumulatorV2<?,?> a)
      Registers an AccumulatorV2 created on the driver such that it can be used on the executors.

      All accumulators registered here can later be used as containers for accumulating partial values across multiple tasks; this is what org.apache.spark.scheduler.DAGScheduler does. Note: an accumulator registered here should also be registered with the active context cleaner for cleanup, to avoid memory leaks.

      If an AccumulatorV2 with the same ID has already been registered, this does nothing rather than overwriting it. The same accumulator is never registered twice in practice; this is just a sanity check.

      Parameters:
      a - the AccumulatorV2 to register
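      A minimal sketch of the usual path (a hypothetical local-mode app; SparkContext.longAccumulator is the public entry point that assigns an ID and performs this registration, so user code never calls register directly):

        import org.apache.spark.SparkConf;
        import org.apache.spark.SparkContext;
        import org.apache.spark.util.LongAccumulator;

        public class RegisterSketch {
            public static void main(String[] args) {
                SparkContext sc = new SparkContext(
                    new SparkConf().setMaster("local[2]").setAppName("acc-demo"));
                // Creating the accumulator through SparkContext assigns it a
                // fresh ID and registers it with AccumulatorContext (and, per
                // the note above, with the context cleaner), so executors can
                // merge partial values back into it by ID.
                LongAccumulator acc = sc.longAccumulator("records");
                System.out.println("registered id: " + acc.id());
                sc.stop();
            }
        }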
    • remove

      public static void remove(long id)
      Unregisters the AccumulatorV2 with the given ID, if any.
      Parameters:
      id - the ID of the AccumulatorV2 to unregister
    • get

      public static scala.Option<AccumulatorV2<?,?>> get(long id)
      Returns the AccumulatorV2 registered with the given ID, if any.
      Parameters:
      id - the ID of the AccumulatorV2 to look up
      Returns:
      the AccumulatorV2 registered under the given ID, or None if none is registered
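      A sketch combining get with remove above (the id parameter is hypothetical, e.g. obtained from AccumulatorV2.id() after registration):

        import org.apache.spark.util.AccumulatorContext;
        import org.apache.spark.util.AccumulatorV2;
        import scala.Option;

        public class LookupSketch {
            // `id` is a hypothetical accumulator ID obtained elsewhere.
            static void inspectAndDrop(long id) {
                Option<AccumulatorV2<?, ?>> maybe = AccumulatorContext.get(id);
                if (maybe.isDefined()) {
                    System.out.println("current value: " + maybe.get().value());
                } else {
                    System.out.println("no accumulator registered under id " + id);
                }
                // Unregister once done; a no-op if the ID is already gone.
                AccumulatorContext.remove(id);
            }
        }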
    • clear

      public static void clear()
      Clears all registered AccumulatorV2s. For testing only.
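      A sketch of the intended test-only usage (a hypothetical test teardown):

        import org.apache.spark.util.AccumulatorContext;

        public class TestTeardownSketch {
            static void tearDown() {
                // Wipe the registry between tests so accumulators from one
                // test cannot leak into the next; numAccums() confirms it
                // is empty again.
                AccumulatorContext.clear();
                assert AccumulatorContext.numAccums() == 0;
            }
        }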
    • internOption

      public static scala.Option<Object> internOption(scala.Option<Object> value)
      A naive way to reduce duplicate Some objects for the values 0 and -1. TODO: if this eventually spreads to more values, Guava's weak interner would be a better solution.
      Parameters:
      value - the Option to intern
      Returns:
      a shared Some instance when the wrapped value is 0 or -1, otherwise the original value
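      A sketch of the intended effect (assuming, per the description above, that the boxed long values 0 and -1 are the ones interned):

        import org.apache.spark.util.AccumulatorContext;
        import scala.Option;
        import scala.Some;

        public class InternSketch {
            public static void main(String[] args) {
                // Two independently allocated Some(0L) wrappers...
                Option<Object> a = AccumulatorContext.internOption(new Some<Object>(0L));
                Option<Object> b = AccumulatorContext.internOption(new Some<Object>(0L));
                // ...come back as one shared instance, saving heap when many
                // task results carry the common values 0 and -1.
                System.out.println(a == b); // expected: true for interned values
            }
        }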
    • org$apache$spark$internal$Logging$$log_

      public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
      Synthetic accessor generated by the Scala compiler for the org.apache.spark.internal.Logging mixin; not intended for direct use.
    • org$apache$spark$internal$Logging$$log__$eq

      public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
      Synthetic setter generated by the Scala compiler for the org.apache.spark.internal.Logging mixin; not intended for direct use.