com.alpine.plugin.core.spark.templates

SparkDataFrameGUINode

abstract class SparkDataFrameGUINode[Job <: SparkDataFrameJob] extends TemplatedSparkDataFrameGUINode[HdfsTabularDataset]

Controls the GUI of your Spark job. Through this class you can specify the visualization for your job's output and the parameters the user will need to set.
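For orientation, a concrete GUI node is declared by pairing it with the SparkDataFrameJob subclass that implements the runtime. The sketch below is illustrative only: the class names are hypothetical, and it assumes SparkDataFrameJob can be extended without overriding its transformation (i.e. that a pass-through default exists); a real plugin would override the job's logic.

    import com.alpine.plugin.core.spark.templates.{SparkDataFrameGUINode, SparkDataFrameJob}

    // Hypothetical runtime counterpart; a real plugin would override its
    // transformation logic. Shown empty here only to satisfy the Job type
    // parameter (assumes a default pass-through implementation exists).
    class MyTransformJob extends SparkDataFrameJob

    // A minimal GUI node: inherits the default parameters (output format and
    // location), the pass-through output schema, and the default output
    // visualization. Override the members documented below to customize it.
    class MyTransformGUINode extends SparkDataFrameGUINode[MyTransformJob]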

Linear Supertypes

  TemplatedSparkDataFrameGUINode, OperatorGUINode, AnyRef, Any

Instance Constructors

  1. new SparkDataFrameGUINode()

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. def defineEntireOutputSchema(inputSchema: TabularSchema, params: OperatorParameters): TabularSchema

    Override this method to define the output schema in some way other than as a fixed sequence of column definitions.
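    For example, an operator whose number of output columns depends on a user parameter could build the schema directly. This is a hypothetical sketch to be placed in a SparkDataFrameGUINode subclass: the parameter id "numBuckets", the getIntValue and getDefinedColumns accessors, and the ColumnDef / ColumnType / TabularSchema factories are assumptions about the SDK, shown only to illustrate the shape of an override.

      override def defineEntireOutputSchema(inputSchema: TabularSchema,
                                            params: OperatorParameters): TabularSchema = {
        // Hypothetical: one extra output column per bucket requested by the user.
        val numBuckets = params.getIntValue("numBuckets") // assumed accessor
        val bucketColumns = (0 until numBuckets).map(i => ColumnDef("bucket_" + i, ColumnType.Double))
        TabularSchema(inputSchema.getDefinedColumns ++ bucketColumns) // assumed factory/accessor
      }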

  9. def defineOutputSchemaColumns(inputSchema: TabularSchema, params: OperatorParameters): Seq[ColumnDef]

    Override this method to define the output schema by assigning fixed column definitions.

    Override this method to define the output schema by assigning fixed column definitions. If you want a variable number of output columns, override the defineEntireOutputSchema method instead. The default implementation of this method returns the same columns as the input data.

    inputSchema

    - the Alpine 'TabularSchema' for the input DataFrame

    params

    The parameters of the operator, including values set by the user.

    returns

    A sequence of column definitions used to create the output schema
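    For example, an operator that keeps every input column and appends a single numeric result column could be sketched as below. This is a sketch for a SparkDataFrameGUINode subclass; the getDefinedColumns accessor and the ColumnDef / ColumnType members are assumptions about the SDK, and the column name "prediction" is purely illustrative.

      override def defineOutputSchemaColumns(inputSchema: TabularSchema,
                                             params: OperatorParameters): Seq[ColumnDef] = {
        // Keep every input column and append one fixed output column.
        inputSchema.getDefinedColumns :+ ColumnDef("prediction", ColumnType.Double) // assumed members
      }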

  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  12. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  15. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  16. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. final def notify(): Unit

    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  19. def onInputOrParameterChange(inputSchemas: Map[String, TabularSchema], params: OperatorParameters, operatorSchemaManager: OperatorSchemaManager): OperatorStatus

    Calls 'updateOutputSchema' when the parameters are changed.

    inputSchemas

    If the connected inputs contain tabular schemas, this is where they can be accessed, each with unique Ids.

    params

    The current parameter values to the operator.

    operatorSchemaManager

    This should be used to change the input/output schema, etc.

    returns

    A status object about whether the inputs and/or parameters are valid. The default implementation assumes that the connected inputs and/or parameters are valid.

    Definition Classes
    SparkDataFrameGUINode → OperatorGUINode
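    An override typically lets the template update the output schema and then reports whether the current configuration is valid. The sketch below is hypothetical and belongs in a SparkDataFrameGUINode subclass; the OperatorStatus(isValid, msg) constructor shape and the particular check are assumptions used only for illustration.

      override def onInputOrParameterChange(inputSchemas: Map[String, TabularSchema],
                                            params: OperatorParameters,
                                            operatorSchemaManager: OperatorSchemaManager): OperatorStatus = {
        // Let the template recompute the output schema first.
        super.onInputOrParameterChange(inputSchemas, params, operatorSchemaManager)
        // Hypothetical validation: require at least one connected tabular input.
        if (inputSchemas.isEmpty)
          OperatorStatus(isValid = false, msg = Some("Connect a tabular input.")) // assumed constructor
        else
          OperatorStatus(isValid = true, msg = None)
      }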
  20. def onOutputVisualization(params: OperatorParameters, output: HdfsTabularDataset, visualFactory: VisualModelFactory): VisualModel

    This is invoked by the GUI to customize the operator's output visualization after the operator finishes running.

    This is invoked by the GUI to customize the operator's output visualization after the operator finishes running. Each output type has an associated default visualization, but the developer can customize it here.

    params

    The parameter values to the operator.

    output

    This is the output from running the operator.

    visualFactory

    For creating visual models.

    returns

    The visual model to be sent to the GUI for visualization.

    Definition Classes
    OperatorGUINode
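    For example, a node could combine a preview of the output dataset with a short text note. The sketch below is hypothetical and belongs in a SparkDataFrameGUINode subclass; createCompositeVisualModel, addVisualModel, createTabularDatasetVisualization and createTextVisualization are assumptions about the VisualModelFactory API, used only to illustrate the override.

      override def onOutputVisualization(params: OperatorParameters,
                                         output: HdfsTabularDataset,
                                         visualFactory: VisualModelFactory): VisualModel = {
        // Combine a preview of the output dataset with a short text note
        // (all factory methods below are assumed, not confirmed, API).
        val composite = visualFactory.createCompositeVisualModel()
        composite.addVisualModel("Output", visualFactory.createTabularDatasetVisualization(output))
        composite.addVisualModel("Notes", visualFactory.createTextVisualization("Transformation completed."))
        composite
      }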
  21. def onPlacement(operatorDialog: OperatorDialog, operatorDataSourceManager: OperatorDataSourceManager, operatorSchemaManager: OperatorSchemaManager): Unit

    Defines the parameters the user will be able to set.

    Defines the parameters the user will be able to set. The default implementation asks for the desired output format and output location.

    operatorDialog

    The operator dialog where the operator can add input text boxes, etc., to define the UI for parameter inputs.

    operatorDataSourceManager

    Before the operator's runtime executes, the developer should determine the underlying platform that the runtime will execute against. E.g., an operator may have access to two different Hadoop clusters or to multiple databases, but a runtime can run on only one platform. A default platform is used if nothing is done.

    operatorSchemaManager

    This can be used to provide information about the nature of the output/input schemas. E.g., provide the output schema.

    Definition Classes
    SparkDataFrameGUINode → OperatorGUINode
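    A typical override keeps the default output parameters and adds operator-specific ones. The sketch below is hypothetical and belongs in a SparkDataFrameGUINode subclass; the addDropdownBox method and its argument order are assumptions about the OperatorDialog API, and the parameter id "mode" is purely illustrative.

      override def onPlacement(operatorDialog: OperatorDialog,
                               operatorDataSourceManager: OperatorDataSourceManager,
                               operatorSchemaManager: OperatorSchemaManager): Unit = {
        // Keep the default output-format and output-location parameters.
        super.onPlacement(operatorDialog, operatorDataSourceManager, operatorSchemaManager)
        // Hypothetical extra parameter (assumed dialog method and signature).
        operatorDialog.addDropdownBox("mode", "Aggregation mode", Seq("sum", "avg", "max"), "sum")
      }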
  22. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  23. def toString(): String

    Definition Classes
    AnyRef → Any
  24. def updateOutputSchema(inputSchemas: Map[String, TabularSchema], params: OperatorParameters, operatorSchemaManager: OperatorSchemaManager): Unit

    Attributes
    protected
  25. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
