Package org.apache.spark.util
Class RpcUtils
java.lang.Object
    org.apache.spark.util.RpcUtils
Constructor Summary

RpcUtils()
Method Summary

static org.apache.spark.rpc.RpcTimeout askRpcTimeout(SparkConf conf)
    Returns the default Spark timeout to use for RPC ask operations.
static org.apache.spark.rpc.RpcTimeout INFINITE_TIMEOUT()
    An infinite timeout used internally; no timeout configuration property controls it.
static org.apache.spark.rpc.RpcTimeout lookupRpcTimeout(SparkConf conf)
    Returns the default Spark timeout to use for RPC remote endpoint lookup.
static org.apache.spark.rpc.RpcEndpointRef makeDriverRef(String name, SparkConf conf, org.apache.spark.rpc.RpcEnv rpcEnv)
    Retrieve an RpcEndpointRef which is located in the driver via its name.
static int maxMessageSizeBytes(SparkConf conf)
    Returns the configured max message size for messages in bytes.
Constructor Details
RpcUtils
public RpcUtils()
Method Details
makeDriverRef
public static org.apache.spark.rpc.RpcEndpointRef makeDriverRef(String name, SparkConf conf, org.apache.spark.rpc.RpcEnv rpcEnv)

Retrieve an RpcEndpointRef which is located in the driver via its name.

Parameters:
    name - (undocumented)
    conf - (undocumented)
    rpcEnv - (undocumented)
Returns:
    (undocumented)
askRpcTimeout
public static org.apache.spark.rpc.RpcTimeout askRpcTimeout(SparkConf conf)

Returns the default Spark timeout to use for RPC ask operations.
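As of recent Spark releases, the ask timeout is read from spark.rpc.askTimeout, falling back to spark.network.timeout (default 120s) when it is unset. A minimal sketch (the class name RpcTimeoutExample is arbitrary; spark-core must be on the classpath):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.util.RpcUtils;

public class RpcTimeoutExample {
    public static void main(String[] args) {
        // spark.rpc.askTimeout overrides the fallback spark.network.timeout.
        SparkConf conf = new SparkConf().set("spark.rpc.askTimeout", "30s");
        org.apache.spark.rpc.RpcTimeout timeout = RpcUtils.askRpcTimeout(conf);
        // duration() is a scala.concurrent.duration.FiniteDuration
        System.out.println(timeout.duration().toSeconds());
    }
}
```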
lookupRpcTimeout
public static org.apache.spark.rpc.RpcTimeout lookupRpcTimeout(SparkConf conf)

Returns the default Spark timeout to use for RPC remote endpoint lookup.
INFINITE_TIMEOUT
public static org.apache.spark.rpc.RpcTimeout INFINITE_TIMEOUT()

The infinite timeout is used internally, so there is no timeout configuration property that controls it. Therefore, "infinite" is used as its timeout configuration property, without any specific reason. Its timeout property should never be accessed, since infinite means we never time out.

Returns:
    (undocumented)
maxMessageSizeBytes
public static int maxMessageSizeBytes(SparkConf conf)

Returns the configured max message size for messages in bytes.
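The size is configured via spark.rpc.message.maxSize, which is specified in MiB (default 128); this method converts it to bytes. A minimal sketch (the class name MaxMessageSizeExample is arbitrary; spark-core must be on the classpath):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.util.RpcUtils;

public class MaxMessageSizeExample {
    public static void main(String[] args) {
        // 16 MiB configured in spark.rpc.message.maxSize ...
        SparkConf conf = new SparkConf().set("spark.rpc.message.maxSize", "16");
        // ... comes back as 16 * 1024 * 1024 bytes.
        System.out.println(RpcUtils.maxMessageSizeBytes(conf));
    }
}
```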