Package org.apache.spark.streaming.util

Class WriteAheadLogUtils

Object
  org.apache.spark.streaming.util.WriteAheadLogUtils

A helper class with utility functions related to the WriteAheadLog interface.
Constructor Summary

Constructors:
  WriteAheadLogUtils()

Method Summary

static WriteAheadLog createLogForDriver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
  Create a WriteAheadLog for the driver.

static WriteAheadLog createLogForReceiver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
  Create a WriteAheadLog for the receiver.

static boolean enableReceiverLog(SparkConf conf)

static long getBatchingTimeout(SparkConf conf)
  How long we will wait for the wrappedLog in the BatchedWriteAheadLog to write the records before we fail the write attempt, to unblock receivers.

static int getMaxFailures(SparkConf conf, boolean isDriver)

static int getRollingIntervalSecs(SparkConf conf, boolean isDriver)

static boolean isBatchingEnabled(SparkConf conf, boolean isDriver)

static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)

static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()

static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)

static boolean shouldCloseFileAfterWrite(SparkConf conf, boolean isDriver)
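The query helpers above read WAL-related settings from a SparkConf. A minimal sketch of how they might be used; the configuration key `spark.streaming.receiver.writeAheadLog.enable` is Spark's standard receiver-WAL flag and is assumed here rather than documented on this page:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.util.WriteAheadLogUtils;

public class WalConfigExample {
    /** Returns whether the receiver-side WAL is enabled in the given conf. */
    public static boolean receiverWalEnabled(SparkConf conf) {
        return WriteAheadLogUtils.enableReceiverLog(conf);
    }

    public static void main(String[] args) {
        // Assumed key: enables the receiver-side write-ahead log.
        SparkConf conf = new SparkConf()
                .set("spark.streaming.receiver.writeAheadLog.enable", "true");

        System.out.println("receiver WAL enabled: " + receiverWalEnabled(conf));

        // The per-role helpers take an isDriver flag to pick driver- or
        // receiver-side settings.
        int rollSecs = WriteAheadLogUtils.getRollingIntervalSecs(conf, true);
        int maxFailures = WriteAheadLogUtils.getMaxFailures(conf, true);
        System.out.println("rolling interval (s): " + rollSecs
                + ", max failures: " + maxFailures);
    }
}
```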
Constructor Details

WriteAheadLogUtils
public WriteAheadLogUtils()
Method Details

enableReceiverLog
public static boolean enableReceiverLog(SparkConf conf)

getRollingIntervalSecs
public static int getRollingIntervalSecs(SparkConf conf, boolean isDriver)

getMaxFailures
public static int getMaxFailures(SparkConf conf, boolean isDriver)

isBatchingEnabled
public static boolean isBatchingEnabled(SparkConf conf, boolean isDriver)

getBatchingTimeout
public static long getBatchingTimeout(SparkConf conf)
How long we will wait for the wrappedLog in the BatchedWriteAheadLog to write the records before we fail the write attempt, to unblock receivers.
Parameters:
conf - (undocumented)
Returns:
(undocumented)
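The timeout is read from the SparkConf. A sketch of setting and querying it; the key `spark.streaming.driver.writeAheadLog.batchingTimeout` (milliseconds) is Spark's standard setting for this value, assumed here rather than stated on this page:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.util.WriteAheadLogUtils;

public class BatchingTimeoutExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                // Assumed key: how long BatchedWriteAheadLog waits for the
                // wrapped log to complete a write, in milliseconds.
                .set("spark.streaming.driver.writeAheadLog.batchingTimeout", "10000");

        long timeoutMs = WriteAheadLogUtils.getBatchingTimeout(conf);
        System.out.println("batching timeout (ms): " + timeoutMs);
    }
}
```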
shouldCloseFileAfterWrite
public static boolean shouldCloseFileAfterWrite(SparkConf conf, boolean isDriver)
createLogForDriver
public static WriteAheadLog createLogForDriver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
Create a WriteAheadLog for the driver. If configured with a custom WAL class, it will try to create an instance of that class; otherwise it will create the default FileBasedWriteAheadLog.
Parameters:
sparkConf - (undocumented)
fileWalLogDirectory - (undocumented)
fileWalHadoopConf - (undocumented)
Returns:
(undocumented)
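A minimal sketch of creating and using a driver-side log. The log directory `/tmp/driver-wal` is an illustrative path, and the key `spark.streaming.driver.writeAheadLog.allowBatching` is assumed to be Spark's standard switch for batching (set to false here so the call returns the plain FileBasedWriteAheadLog):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.util.WriteAheadLog;
import org.apache.spark.streaming.util.WriteAheadLogUtils;

public class DriverWalExample {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf()
                // Assumed key: disable the BatchedWriteAheadLog wrapper.
                .set("spark.streaming.driver.writeAheadLog.allowBatching", "false");
        Configuration hadoopConf = new Configuration();

        // No custom WAL class is configured, so this creates the default
        // FileBasedWriteAheadLog writing under the given directory.
        WriteAheadLog wal = WriteAheadLogUtils.createLogForDriver(
                sparkConf, "/tmp/driver-wal", hadoopConf);
        try {
            // Append one record, stamped with the current time.
            ByteBuffer record = ByteBuffer.wrap(
                    "event".getBytes(StandardCharsets.UTF_8));
            wal.write(record, System.currentTimeMillis());
        } finally {
            wal.close();
        }
    }
}
```

The same pattern applies to createLogForReceiver below; only the configuration namespace differs between the driver- and receiver-side logs.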
createLogForReceiver
public static WriteAheadLog createLogForReceiver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
Create a WriteAheadLog for the receiver. If configured with a custom WAL class, it will try to create an instance of that class; otherwise it will create the default FileBasedWriteAheadLog.
Parameters:
sparkConf - (undocumented)
fileWalLogDirectory - (undocumented)
fileWalHadoopConf - (undocumented)
Returns:
(undocumented)
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()

org$apache$spark$internal$Logging$$log__$eq
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)

LogStringContext
public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)