org.apache.spark.mllib.classification

LogisticRegressionWithSGD

object LogisticRegressionWithSGD extends Serializable

Top-level methods for calling Logistic Regression using Stochastic Gradient Descent. NOTE: Labels used in Logistic Regression should be {0, 1}. A minimal usage sketch follows the supertype list below.

Annotations
@Since( "0.8.0" )
Source
LogisticRegression.scala
Linear Supertypes
Serializable, Serializable, AnyRef, Any
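
Example

A minimal end-to-end usage sketch, not part of the generated API listing: it builds a tiny in-memory RDD of LabeledPoint rows with {0, 1} labels, trains with the two-argument train overload documented below, and predicts on a new feature vector. The toy data, application name, and local master setting are illustrative assumptions, not recommendations.

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.mllib.classification.LogisticRegressionWithSGD
  import org.apache.spark.mllib.linalg.Vectors
  import org.apache.spark.mllib.regression.LabeledPoint

  object LogisticRegressionWithSGDExample {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("LogisticRegressionWithSGDExample").setMaster("local[*]")
      val sc = new SparkContext(conf)

      // Labels must be 0 or 1; features are dense vectors here.
      val trainingData = sc.parallelize(Seq(
        LabeledPoint(0.0, Vectors.dense(0.0, 1.1)),
        LabeledPoint(1.0, Vectors.dense(2.0, 1.0)),
        LabeledPoint(0.0, Vectors.dense(0.5, 0.7)),
        LabeledPoint(1.0, Vectors.dense(1.8, 1.3))
      )).cache()

      // Simplest overload: 100 iterations, step size 1.0, full batch per iteration.
      val model = LogisticRegressionWithSGD.train(trainingData, 100)

      // Predict the {0, 1} label for a new feature vector.
      val prediction = model.predict(Vectors.dense(1.5, 1.0))
      println(s"prediction = $prediction, weights = ${model.weights}, intercept = ${model.intercept}")

      sc.stop()
    }
  }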

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  18. def toString(): String

    Definition Classes
    AnyRef → Any
  19. def train(input: RDD[LabeledPoint], numIterations: Int): LogisticRegressionModel

    Train a logistic regression model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using a step size of 1.0. We use the entire data set to update the gradient in each iteration. NOTE: Labels used in Logistic Regression should be {0, 1}. (A usage sketch covering this and the other train overloads follows this member list.)

    input

    RDD of (label, array of features) pairs.

    numIterations

    Number of iterations of gradient descent to run.

    returns

    a LogisticRegressionModel which has the weights and offset from training.

    Annotations
    @Since( "1.0.0" )
  20. def train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double): LogisticRegressionModel

    Train a logistic regression model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using the specified step size. We use the entire data set to update the gradient in each iteration. NOTE: Labels used in Logistic Regression should be {0, 1}.

    input

    RDD of (label, array of features) pairs.

    numIterations

    Number of iterations of gradient descent to run.

    stepSize

    Step size to be used for each iteration of gradient descent.

    returns

    a LogisticRegressionModel which has the weights and offset from training.

    Annotations
    @Since( "1.0.0" )
  21. def train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double, miniBatchFraction: Double): LogisticRegressionModel

    Train a logistic regression model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using the specified step size. Each iteration uses miniBatchFraction fraction of the data to calculate the gradient. NOTE: Labels used in Logistic Regression should be {0, 1}.

    input

    RDD of (label, array of features) pairs.

    numIterations

    Number of iterations of gradient descent to run.

    stepSize

    Step size to be used for each iteration of gradient descent.

    miniBatchFraction

    Fraction of data to be used per iteration.

    Annotations
    @Since( "1.0.0" )
  22. def train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double, miniBatchFraction: Double, initialWeights: Vector): LogisticRegressionModel

    Train a logistic regression model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using the specified step size. Each iteration uses miniBatchFraction fraction of the data to calculate the gradient. The weights used in gradient descent are initialized using the initial weights provided. NOTE: Labels used in Logistic Regression should be {0, 1}.

    input

    RDD of (label, array of features) pairs.

    numIterations

    Number of iterations of gradient descent to run.

    stepSize

    Step size to be used for each iteration of gradient descent.

    miniBatchFraction

    Fraction of data to be used per iteration.

    initialWeights

    Initial set of weights to be used. The vector should be equal in size to the number of features in the data.

    Annotations
    @Since( "1.0.0" )
  23. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  25. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
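
Example

A hedged sketch of the four train overloads listed above, assuming trainingData is an existing RDD[LabeledPoint] with labels in {0, 1} and two features per point; the iteration count, step size, mini-batch fraction, and initial weights are illustrative values, not recommendations.

  import org.apache.spark.mllib.classification.{LogisticRegressionModel, LogisticRegressionWithSGD}
  import org.apache.spark.mllib.linalg.Vectors
  import org.apache.spark.mllib.regression.LabeledPoint
  import org.apache.spark.rdd.RDD

  def trainVariants(trainingData: RDD[LabeledPoint]): Seq[LogisticRegressionModel] = {
    // numIterations only: step size 1.0, full batch per iteration.
    val basic = LogisticRegressionWithSGD.train(trainingData, 200)

    // numIterations plus an explicit step size.
    val withStepSize = LogisticRegressionWithSGD.train(trainingData, 200, 0.1)

    // Add a mini-batch fraction: each iteration samples half of the data for the gradient.
    val withMiniBatch = LogisticRegressionWithSGD.train(trainingData, 200, 0.1, 0.5)

    // Also supply initial weights; the vector length must equal the number of features.
    val initialWeights = Vectors.dense(0.0, 0.0)
    val warmStarted = LogisticRegressionWithSGD.train(trainingData, 200, 0.1, 0.5, initialWeights)

    Seq(basic, withStepSize, withMiniBatch, warmStarted)
  }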

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
