com.alpine.plugin.core.utils

object SparkParameterUtils

Convenience functions for directly adding Spark-related options to the dialog window.

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. def addStandardSparkOptions(operatorDialog: OperatorDialog, additionalSparkParameters: List[SparkParameter]): Unit

  5. def addStandardSparkOptionsWithStorageLevel(operatorDialog: OperatorDialog, defaultNumExecutors: Int, defaultExecutorMemoryMB: Int, defaultDriverMemoryMB: Int, defaultNumExecutorCores: Int, defaultStorageLevel: String, additionalSparkParameters: List[SparkParameter] = List.empty[SparkParameter]): Unit


    A more advanced method for adding Spark parameters. It also adds a "StorageLevel" parameter, which indicates what level of persistence to use within a Spark job. NOTE: the storage level parameter cannot be applied automatically at runtime. To have any effect, the custom operator developer must implement RDD persistence with this value (retrievable via the 'getStorageLevel' method) in the Spark job class of their operator.

    defaultStorageLevel

    - the default storage level, e.g. "NONE" or "MEMORY_AND_DISK".

    additionalSparkParameters

    - a list of additional Spark parameters.
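
    A call to this method from a custom operator's GUI node might look like the sketch below. The numeric defaults are made-up example values, and the stub traits stand in for the real SDK types so the snippet is self-contained.

```scala
// Illustrative stubs for the SDK types (the real ones live in the
// com.alpine.plugin.core packages).
trait SparkParameter
trait OperatorDialog

def addMySparkOptions(operatorDialog: OperatorDialog): Unit = {
  // With the real SDK on the classpath, per the signature above:
  // SparkParameterUtils.addStandardSparkOptionsWithStorageLevel(
  //   operatorDialog,
  //   defaultNumExecutors = 4,         // made-up defaults
  //   defaultExecutorMemoryMB = 2048,
  //   defaultDriverMemoryMB = 1024,
  //   defaultNumExecutorCores = 2,
  //   defaultStorageLevel = "MEMORY_AND_DISK"
  // )  // additionalSparkParameters defaults to List.empty
}
```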

  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. val disableDynamicAllocationParamId: String

  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  13. def getRepartition(operatorParameters: OperatorParameters): Boolean


    Returns true if the repartition parameter was added AND the user checked it; returns false otherwise. Note that because this parameter is exposed as a checkbox by the Alpine engine, the value of the parameter will be either "true" or "false" (the string representation of a Java boolean).
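
    Since the checkbox value round-trips through the engine as a string, the lookup can be sketched in plain Scala ("repartitionRequested" and its "stored" argument are illustrative stand-ins for what getRepartition does internally):

```scala
// The Alpine engine stores the checkbox as "true"/"false" (the string
// form of a Java boolean); absence means the parameter was never added.
def repartitionRequested(stored: Option[String]): Boolean =
  stored.contains("true")

assert(repartitionRequested(Some("true")))   // added and checked
assert(!repartitionRequested(Some("false"))) // added but unchecked
assert(!repartitionRequested(None))          // parameter never added
```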

  14. def getStorageLevel(operatorParameters: OperatorParameters): Option[String]


    Retrieves the storage level parameter added via "makeStorageLevelParam" from the advanced parameters box. Returns None if the parameter was not added. NOTE: this method does not validate the string, so if the user entered an invalid storage level, calling StorageLevel.fromString(s) on the result of this method will fail.
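
    Because the returned string is unvalidated, a defensive caller can guard the conversion before handing it to Spark. In this self-contained sketch, "parse" mimics the throwing behaviour of StorageLevel.fromString against a few common level names (the real method accepts more, e.g. the "_2" replicated variants):

```scala
import scala.util.Try

// A few common storage level names; the real StorageLevel.fromString
// throws IllegalArgumentException on anything it does not recognise,
// and `parse` mimics that behaviour here.
val knownLevels = Set("NONE", "MEMORY_ONLY", "MEMORY_AND_DISK", "DISK_ONLY")

def parse(s: String): String =
  if (knownLevels(s)) s
  else throw new IllegalArgumentException(s"Invalid StorageLevel: $s")

// Guard the conversion so a bad user entry degrades to None instead of
// failing the Spark job:
def safeLevel(fromParams: Option[String]): Option[String] =
  fromParams.flatMap(s => Try(parse(s)).toOption)

assert(safeLevel(Some("MEMORY_AND_DISK")) == Some("MEMORY_AND_DISK"))
assert(safeLevel(Some("MEMORY AND DISK")).isEmpty) // invalid entry
assert(safeLevel(None).isEmpty)                    // param never added
```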

  15. def getUserSetNumPartitions(operatorParameters: OperatorParameters): Option[Int]


    Retrieves the value of the number-of-partitions parameter added to the advanced Spark box. The custom operator developer should be aware that Alpine auto tuning determines an optimal number of partitions and assigns it to spark.default.parallelism, so before repartitioning, the developer should check whether that property is set if this method returns None.
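
    The recommended fallback can be sketched with a plain Map standing in for the SparkConf (illustrative; real code would read the user value via getUserSetNumPartitions and the conf via SparkConf.getOption):

```scala
// Prefer the user-set partition count; otherwise fall back to
// spark.default.parallelism (which Alpine auto tuning may have set),
// and finally to the input's current partition count.
def effectiveNumPartitions(
    userSet: Option[Int],
    conf: Map[String, String],
    currentPartitions: Int): Int =
  userSet
    .orElse(conf.get("spark.default.parallelism").map(_.toInt))
    .getOrElse(currentPartitions)

assert(effectiveNumPartitions(Some(8), Map.empty, 4) == 8)
assert(effectiveNumPartitions(None, Map("spark.default.parallelism" -> "16"), 4) == 16)
assert(effectiveNumPartitions(None, Map.empty, 4) == 4)
```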

  16. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  17. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  18. def makeNumPartitionsParam: SparkParameter


    Creates a "Number of Partitions" parameter to let the user determine how many partitions should be used, either when repartitioning the data (controlled by the repartition parameter above) or when shuffling generally. If this parameter is not set, a value will be selected by auto tuning. If a value is set, it will be used to set "spark.default.parallelism" (or "spark.sql.shuffle.partitions" for Spark SQL), which controls the default number of partitions used in a wide transformation in Spark.

  19. def makeRepartitionParam: SparkParameter


    Create a "Repartition Data" checkbox to let the user determine whether the input data should be shuffled to increase the number of partitions.

  20. def makeStorageLevelParam(defaultStorageLevel: String): SparkParameter


    Adds a storage level parameter. The default must be the string representation of a Spark storage level, e.g. "MEMORY_AND_DISK".

  21. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  22. final def notify(): Unit

    Definition Classes
    AnyRef
  23. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  24. val numPartitionsId: String

  25. val repartitionRDDId: String

  26. val sparkDriverMBElementId: String

  27. val sparkExecutorMBElementId: String

  28. val sparkNumExecutorCoresElementId: String

  29. val sparkNumExecutorsElementId: String

  30. val storageLevelParamId: String

  31. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  32. def toString(): String

    Definition Classes
    AnyRef → Any
  33. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  35. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def addStandardSparkOptions(operatorDialog: OperatorDialog, defaultNumExecutors: Int, defaultExecutorMemoryMB: Int, defaultDriverMemoryMB: Int, defaultNumExecutorCores: Int): Unit

    Annotations
    @deprecated
    Deprecated
  2. def addStandardSparkOptions(operatorDialog: OperatorDialog, defaultNumExecutors: Int, defaultExecutorMemoryMB: Int, defaultDriverMemoryMB: Int, defaultNumExecutorCores: Int, additionalSparkParameters: List[SparkParameter]): Unit

    Annotations
    @deprecated
    Deprecated
