spark.streaming.input

KafkaFunctions

class KafkaFunctions extends AnyRef

A wrapper around StreamingContext that exposes Kafka input-stream functions. These functions live in a separate class so that programs that don't use Kafka don't need to link against it. In Scala, an implicit conversion in StreamingContext allows them to be called as if they were defined on StreamingContext itself, as in the sketch below.
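
A minimal usage sketch of this pattern. The host names, group id, and topic are illustrative, and the import of the StreamingContext companion object to bring the implicit conversion into scope is an assumption:

  import spark.streaming.{Seconds, StreamingContext}
  import spark.streaming.StreamingContext._  // assumed: brings the implicit conversion into scope

  val ssc = new StreamingContext("local[2]", "KafkaExample", Seconds(2))
  // With the conversion in scope, kafkaStream can be called as though
  // it were defined on StreamingContext itself.
  val lines = ssc.kafkaStream("zk1:2181,zk2:2181", "my-group", Map("events" -> 2))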

Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new KafkaFunctions(self: StreamingContext)

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws()
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws()
  11. final def getClass(): java.lang.Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. def kafkaStream[T, D <: kafka.serializer.Decoder[_]](kafkaParams: Map[String, String], topics: Map[String, Int], storageLevel: StorageLevel)(implicit arg0: ClassManifest[T], arg1: Manifest[D]): DStream[T]

    Create an input stream that pulls messages from a Kafka Broker.

    kafkaParams

    Map of Kafka configuration parameters. See: http://kafka.apache.org/configuration.html

    topics

    Map of (topic_name -> numPartitions) to consume. Each partition is consumed in its own thread.

    storageLevel

    Storage level to use for storing the received objects.
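
    A minimal sketch of calling this overload, continuing from an ssc with the implicit conversion in scope. The use of kafka.serializer.StringDecoder and the configuration key names are assumptions that depend on your Kafka version (see the configuration page linked above):

      import spark.storage.StorageLevel
      import kafka.serializer.StringDecoder

      // Illustrative configuration; valid key names depend on the Kafka version.
      val kafkaParams = Map(
        "zk.connect" -> "zk1:2181,zk2:2181",
        "groupid"    -> "my-group"
      )
      // Consume topic "events" with two consumer threads, storing received
      // objects serialized in memory and on disk, replicated twice.
      val stream = ssc.kafkaStream[String, StringDecoder](
        kafkaParams,
        Map("events" -> 2),
        StorageLevel.MEMORY_AND_DISK_SER_2
      )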

  15. def kafkaStream(zkQuorum: String, groupId: String, topics: Map[String, Int], storageLevel: StorageLevel = StorageLevel.MEMORY_AND_DISK_SER_2): DStream[String]

    Create an input stream that pulls messages from a Kafka Broker.

    zkQuorum

    ZooKeeper quorum (hostname:port,hostname:port,...).

    groupId

    The group id for this consumer.

    topics

    Map of (topic_name -> numPartitions) to consume. Each partition is consumed in its own thread.

    storageLevel

    Storage level to use for storing the received objects (default: StorageLevel.MEMORY_AND_DISK_SER_2)
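
    A minimal sketch of this overload, continuing from the ssc defined in the class-level example (ZooKeeper hosts, group id, and topic are illustrative); omitting the last argument accepts the MEMORY_AND_DISK_SER_2 default:

      val lines = ssc.kafkaStream(
        "zk1:2181,zk2:2181",  // ZooKeeper quorum
        "my-group",           // consumer group id
        Map("events" -> 2)    // two consumer threads for topic "events"
      )
      lines.print()
      ssc.start()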

  16. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. final def notify(): Unit

    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  19. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  20. def toString(): String

    Definition Classes
    AnyRef → Any
  21. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws()
  22. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws()
  23. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws()
