This is the function that gets called when the workflow is run and the operator starts executing.
Execution context of the operator.
The input to the operator.
The parameter values for the operator.
The listener object to communicate information back to the console.
The output from the execution.
This is called when the user clicks 'Stop'. If the operator is currently running, this function gets called while 'onExecution' is still running, so it is the developer's responsibility to properly stop whatever work is going on within 'onExecution'.
Execution context of the operator.
The listener object to communicate information back to the console.
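A minimal sketch of how 'onStop' can cooperatively interrupt 'onExecution' is shown below. Only 'onExecution', 'onStop', and the listener's role are taken from the description above; the class name, type parameters, helper methods, and the exact listener method are illustrative assumptions, not the SDK's actual signatures.

```scala
// Hypothetical sketch: everything except onExecution/onStop and the
// listener's purpose is illustrative.
class MyOperatorRuntime extends OperatorRuntime[ExecutionContext, IOBase, IOBase] {

  // Volatile flag checked by the work loop below and set by onStop,
  // which may run concurrently on another thread.
  @volatile private var stopRequested = false

  override def onExecution(context: ExecutionContext,
                           input: IOBase,
                           params: OperatorParameters,
                           listener: OperatorListener): IOBase = {
    // Process work in small units so the stop flag is checked often.
    while (!stopRequested && hasMoreWork()) {   // hasMoreWork: hypothetical helper
      processNextChunk()                        // processNextChunk: hypothetical helper
      listener.notifyMessage("Processed a chunk.") // assumed console-reporting method
    }
    buildOutput()                               // buildOutput: hypothetical helper
  }

  // Invoked while onExecution may still be running: just flip the flag
  // and let the loop above exit on its own.
  override def onStop(context: ExecutionContext,
                      listener: OperatorListener): Unit = {
    stopRequested = true
  }
}
```

The key design point is that 'onStop' does not forcibly kill anything; it only signals, and 'onExecution' remains responsible for winding down cleanly.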
Defines the behavior of your plugin after the end user 'runs' it from the GUI. This is a direct descendant of OperatorRuntime that takes SparkExecutionContext as an argument. Its 'onExecution' method (invoked with 'super.onExecution') runs on the local (Alpine) machine, and it is from this method that you could submit a Spark job. Unlike its less generic descendant, SparkRuntimeWithIOTypedJob, which automatically submits a Spark job, you can use this class for more elaborate runtime behavior. For example, you can do some local processing here before manually submitting a SparkPluginJob to the cluster, or you can do purely local computations and return without submitting a job at all.
the IOBase input type of your plugin (must be consistent with the input type of the GUINode class implementation and the plugin signature implementation).
the IOBase output type of your plugin.
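The two usage patterns described above (local-only work vs. manual job submission) can be sketched as follows. Only SparkRuntime, SparkExecutionContext, SparkRuntimeWithIOTypedJob, and SparkPluginJob come from the framework description; the helper methods and the exact job-submission call are assumptions and would need to be checked against the SDK's Scaladoc.

```scala
// Hedged sketch, not the SDK's actual API: submitJob's signature and all
// *Locally helpers are hypothetical.
class MyElaborateRuntime extends SparkRuntime[IOBase, IOBase] {

  override def onExecution(context: SparkExecutionContext,
                           input: IOBase,
                           params: OperatorParameters,
                           listener: OperatorListener): IOBase = {
    // 1. Do some purely local processing on the Alpine machine first.
    val prepared = preprocessLocally(input)      // hypothetical helper

    if (fitsInLocalMemory(prepared)) {           // hypothetical helper
      // 2a. Small data: compute locally and return without submitting a job.
      computeLocally(prepared)                   // hypothetical helper
    } else {
      // 2b. Large data: manually submit a SparkPluginJob to the cluster.
      //     SparkRuntimeWithIOTypedJob would have done this step automatically.
      val submitted = context.submitJob(         // assumed submission call
        classOf[MySparkPluginJob], prepared, params, listener)
      waitForResult(submitted)                   // hypothetical helper
    }
  }
}
```

Compared with SparkRuntimeWithIOTypedJob, this buys flexibility (conditional submission, local pre/post-processing) at the cost of writing the submission plumbing yourself.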