com.alpine.plugin.core.utils

SparkParameterUtils

object SparkParameterUtils

Convenience functions for directly adding Spark-related options to the dialog window.

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def addStandardSparkOptions(operatorDialog: OperatorDialog, additionalSparkParameters: List[SparkParameter]): Unit

  7. def addStandardSparkOptionsWithStorageLevel(operatorDialog: OperatorDialog, defaultNumExecutors: Int, defaultExecutorMemoryMB: Int, defaultDriverMemoryMB: Int, defaultNumExecutorCores: Int, defaultStorageLevel: String, additionalSparkParameters: List[SparkParameter] = List.empty[SparkParameter]): Unit

    A more advanced method for adding Spark parameters. In addition to the standard options, this adds a "StorageLevel" parameter indicating what level of persistence to use within a Spark job. NOTE: the storage level parameter cannot be applied automatically at runtime. To have any effect, the custom operator developer must implement RDD persistence with this value (retrievable with the 'getStorageLevel' method) in the Spark Job class of their operator.

    defaultStorageLevel

    - the default storage level, e.g. "NONE" or "MEMORY_AND_DISK".

    additionalSparkParameters

    - a list of additional Spark parameters.
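
    For example, a minimal sketch of calling this when building an operator's dialog (the helper name and the SDK import paths are assumed here, not taken from this page):

      import com.alpine.plugin.core.dialog.OperatorDialog
      import com.alpine.plugin.core.utils.SparkParameterUtils

      // Hypothetical helper, typically invoked from a GUI node when the
      // operator is placed on the canvas.
      def addSparkControls(operatorDialog: OperatorDialog): Unit = {
        SparkParameterUtils.addStandardSparkOptionsWithStorageLevel(
          operatorDialog,
          defaultNumExecutors = 2,
          defaultExecutorMemoryMB = 1024,
          defaultDriverMemoryMB = 512,
          defaultNumExecutorCores = 1,
          defaultStorageLevel = "MEMORY_AND_DISK" // must be a valid StorageLevel name
        )
      }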

  8. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  9. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  10. val disableDynamicAllocationParamId: String

  11. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  12. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  13. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  14. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  15. def getRepartition(operatorParameters: OperatorParameters): Boolean

    Returns true if the repartition parameter was added AND the user checked it; returns false otherwise. Note that because this parameter is exposed as a checkbox by the Alpine engine, the value of the parameter will be either "true" or "false" (the string representation of Java booleans).
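
    A hedged sketch of honoring this flag inside a Spark job (the helper name, fallback partition count, and import paths are assumptions, not part of this API):

      import com.alpine.plugin.core.OperatorParameters
      import com.alpine.plugin.core.utils.SparkParameterUtils
      import org.apache.spark.rdd.RDD

      // Repartition only when the user checked the box; otherwise leave the RDD alone.
      def maybeRepartition[T](params: OperatorParameters, input: RDD[T]): RDD[T] =
        if (SparkParameterUtils.getRepartition(params)) {
          val n = SparkParameterUtils.getUserSetNumPartitions(params)
            .getOrElse(input.sparkContext.defaultParallelism) // illustrative fallback
          input.repartition(n)
        } else input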

  16. def getStorageLevel(operatorParameters: OperatorParameters): Option[String]

    Retrieves the storage level parameter added via "makeStorageLevelParam" from the advanced parameters box, or None if the parameter was not added. NOTE: this method does not validate the string, so if the user entered an invalid storage level, calling StorageLevel.fromString(s) on the result of this method will fail.
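
    A hedged sketch that guards against the invalid-string case before persisting (the helper name and import paths are assumptions):

      import scala.util.Try
      import com.alpine.plugin.core.OperatorParameters
      import com.alpine.plugin.core.utils.SparkParameterUtils
      import org.apache.spark.rdd.RDD
      import org.apache.spark.storage.StorageLevel

      // Persist only when a storage level was added AND it parses cleanly;
      // StorageLevel.fromString throws on unrecognized input.
      def persistIfRequested[T](params: OperatorParameters, rdd: RDD[T]): RDD[T] =
        SparkParameterUtils.getStorageLevel(params)
          .flatMap(s => Try(StorageLevel.fromString(s)).toOption)
          .map(level => rdd.persist(level))
          .getOrElse(rdd)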

  17. def getUserSetNumPartitions(operatorParameters: OperatorParameters): Option[Int]

    Retrieves the value of the number-of-partitions parameter added to the advanced Spark box. Note, however, that Alpine Auto Tuning determines an optimal number of partitions and assigns it to spark.default.parallelism; if this method returns None, the developer should check whether that property is set before repartitioning.
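
    A hedged sketch of that fallback check (the helper name and import paths are assumptions):

      import com.alpine.plugin.core.OperatorParameters
      import com.alpine.plugin.core.utils.SparkParameterUtils
      import org.apache.spark.SparkContext

      // Prefer the user's explicit value, then whatever auto tuning wrote
      // into spark.default.parallelism, if anything.
      def effectiveNumPartitions(params: OperatorParameters, sc: SparkContext): Option[Int] =
        SparkParameterUtils.getUserSetNumPartitions(params)
          .orElse(sc.getConf.getOption("spark.default.parallelism").map(_.toInt))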

  18. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  19. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  20. def makeNumPartitionsParam: SparkParameter

    Create a "Number of Partitions" parameter to let the user determine how many partitions should be used either when repartitioning the data (controlled by above param) or in when shuffling generally.

    Create a "Number of Partitions" parameter to let the user determine how many partitions should be used either when repartitioning the data (controlled by above param) or in when shuffling generally. If this parameter is not set, a value will be selected by auto tuning. If a value is selected this value will be used to set the "spark.default.parallelism" parameter which controls the default number of parameter used in a wide transformation in Spark.

    returns

    - the "Number of Partitions" SparkParameter.

  21. def makeRepartitionParam: SparkParameter

    Create a "Repartition Data" checkbox to let the user determine whether the input data should be shuffled to increase the number of partitions.

  22. def makeStorageLevelParam(defaultStorageLevel: String): SparkParameter

    Creates the storage level parameter. The default must be the string representation of a Spark storage level, e.g. "MEMORY_AND_DISK".

    defaultStorageLevel

    - the string representation of a Spark storage level, e.g. "MEMORY_AND_DISK"

    returns

    - the storage-level SparkParameter.
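
    For example, a hedged sketch that bundles the three make* helpers into the additional-parameters list for addStandardSparkOptions (the helper name and import paths are assumptions):

      import com.alpine.plugin.core.dialog.OperatorDialog
      import com.alpine.plugin.core.utils.SparkParameterUtils

      def addSparkControlsWithExtras(operatorDialog: OperatorDialog): Unit = {
        val extras = List(            // inferred as List[SparkParameter]
          SparkParameterUtils.makeRepartitionParam,
          SparkParameterUtils.makeNumPartitionsParam,
          SparkParameterUtils.makeStorageLevelParam("MEMORY_ONLY")
        )
        SparkParameterUtils.addStandardSparkOptions(operatorDialog, extras)
      }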

  23. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  24. final def notify(): Unit

    Definition Classes
    AnyRef
  25. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  26. val numPartitionsId: String

  27. val repartitionRDDId: String

  28. val sparkDriverMBElementId: String

  29. val sparkExecutorMBElementId: String

  30. val sparkNumExecutorCoresElementId: String

  31. val sparkNumExecutorsElementId: String

  32. val storageLevelParamId: String

  33. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  34. def toString(): String

    Definition Classes
    AnyRef → Any
  35. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  36. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  37. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def addStandardSparkOptions(operatorDialog: OperatorDialog, defaultNumExecutors: Int, defaultExecutorMemoryMB: Int, defaultDriverMemoryMB: Int, defaultNumExecutorCores: Int): Unit

    Annotations
    @deprecated
    Deprecated
  2. def addStandardSparkOptions(operatorDialog: OperatorDialog, defaultNumExecutors: Int, defaultExecutorMemoryMB: Int, defaultDriverMemoryMB: Int, defaultNumExecutorCores: Int, additionalSparkParameters: List[SparkParameter]): Unit

    Annotations
    @deprecated
    Deprecated
