com.alpine.plugin.core.spark

SparkIOTypedPluginJob

abstract class SparkIOTypedPluginJob[I, O] extends AnyRef

:: AlpineSdkApi ::

This is an extension of SparkPluginJob that handles the serialization/deserialization of inputs and outputs, so you can work directly with IOBase objects without implementing your own (de)serialization logic. This class is intended to be coupled with SparkRuntimeWithIOTypedJob, a descendant of SparkRuntime that takes a descendant of this class as a generic parameter.

Note: It is possible to use this class with a runtime class that extends the generic SparkRuntime class (rather than SparkRuntimeWithIOTypedJob). However, by using SparkRuntimeWithIOTypedJob and overriding its onExecution method, you can get many of the benefits of this class while implementing more complex behavior. With the latter approach, you can use the SparkRuntimeWithIOTypedJob implementation of onExecution as a utility function for submitting the Spark job by calling super.onExecution (see the example below).

I

the input type of your plugin; it must be consistent with the type parameters of your SparkRuntime implementation.

O

output type of your plugin
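
For example, a minimal sketch of this coupling is shown below. The IO type HdfsTabularDataset, the class names ExampleJob and ExampleRuntime, the import paths, and the type-parameter order SparkRuntimeWithIOTypedJob[Job, Input, Output] are illustrative assumptions rather than details confirmed by this page.

  import org.apache.spark.SparkContext
  import com.alpine.plugin.core.{OperatorListener, OperatorParameters}
  import com.alpine.plugin.core.io.HdfsTabularDataset
  import com.alpine.plugin.core.spark.{SparkIOTypedPluginJob, SparkRuntimeWithIOTypedJob}

  // The job works directly with IOBase objects; the framework handles the
  // (de)serialization of the input and output.
  class ExampleJob extends SparkIOTypedPluginJob[HdfsTabularDataset, HdfsTabularDataset] {
    override def onExecution(
        sparkContext: SparkContext,
        appConf: Map[String, String],
        input: HdfsTabularDataset,
        operatorParameters: OperatorParameters,
        listener: OperatorListener): HdfsTabularDataset = {
      // Transform the input with sparkContext here; this sketch passes it through unchanged.
      input
    }
  }

  // The runtime names the job class and the same input/output types in its
  // generic parameters, keeping the two halves of the plugin consistent.
  // Depending on the SDK version, further members (for example, a Spark job
  // configuration method) may also need to be overridden here.
  class ExampleRuntime
    extends SparkRuntimeWithIOTypedJob[ExampleJob, HdfsTabularDataset, HdfsTabularDataset]

ExampleRuntime relies here on the default SparkRuntimeWithIOTypedJob behavior to submit ExampleJob; as the note above describes, you could instead override onExecution in the runtime class and call super.onExecution when you need additional pre- or post-processing around job submission.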

Annotations
@AlpineSdkApi()
Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new SparkIOTypedPluginJob()

Abstract Value Members

  1. abstract def onExecution(sparkContext: SparkContext, appConf: Map[String, String], input: I, operatorParameters: OperatorParameters, listener: OperatorListener): O

    The driver function for the Spark job. Unlike the corresponding function in the parent class, this function allows you to work with IOBase types directly.

    sparkContext

    Spark context created when the Spark job was submitted

    appConf

    a map containing system-related parameters (rather than operator parameters), including all Spark parameters and workflow-level variables

    input

    the IOBase object that you have defined as the input to your plugin. For example, if the GUI node of the plugin takes an HDFSTabularDataset, this input parameter will be that dataset.

    operatorParameters

    the values of the parameters set for this operator in its Alpine GUI node

    listener

    a listener object which allows you to send messages to the Alpine GUI during the Spark job

    returns

    the output of your plugin
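
    A minimal sketch of an onExecution implementation follows, again assuming HdfsTabularDataset as both the input and output type. The listener method notifyMessage, the OperatorParameters accessor getStringValue, the parameter id "threshold", and the import paths are illustrative assumptions rather than details confirmed by this page.

      import org.apache.spark.SparkContext
      import com.alpine.plugin.core.{OperatorListener, OperatorParameters}
      import com.alpine.plugin.core.io.HdfsTabularDataset
      import com.alpine.plugin.core.spark.SparkIOTypedPluginJob

      class ThresholdJob extends SparkIOTypedPluginJob[HdfsTabularDataset, HdfsTabularDataset] {
        override def onExecution(
            sparkContext: SparkContext,
            appConf: Map[String, String],
            input: HdfsTabularDataset,
            operatorParameters: OperatorParameters,
            listener: OperatorListener): HdfsTabularDataset = {
          // appConf carries system-level settings such as Spark parameters and
          // workflow-level variables, not the operator's own parameters.
          val executorMemory = appConf.getOrElse("spark.executor.memory", "<default>")
          listener.notifyMessage(s"Executor memory: $executorMemory") // shown in the Alpine GUI

          // operatorParameters carries the values set in the operator's GUI node;
          // "threshold" is a hypothetical parameter id.
          val threshold = operatorParameters.getStringValue("threshold")
          listener.notifyMessage(s"Threshold parameter: $threshold")

          // Read, transform, and write data with sparkContext here, then return the
          // IOBase object describing the result; this sketch returns the input unchanged.
          input
        }
      }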

Concrete Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  18. def toString(): String

    Definition Classes
    AnyRef → Any
  19. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  20. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  21. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
