Trait

com.alpine.plugin.core.spark

SparkExecutionContext


trait SparkExecutionContext extends ExecutionContext

:: AlpineSdkApi ::

Annotations
@AlpineSdkApi()
Linear Supertypes
ExecutionContext, AnyRef, Any

Abstract Value Members

  1. abstract def appendPath(path: String): OutputStream

    Append contents to the given HDFS path.

    path
      The HDFS path that we want to append to.
    returns
      OutputStream corresponding to the path.
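For illustration, a minimal sketch of appending to an existing HDFS file. The `context` value and the path are assumptions for the example, not part of this API's contract:

```scala
// Sketch: append a line to an existing HDFS file, assuming a
// SparkExecutionContext named `context` is in scope (e.g. inside
// a custom operator's runtime class).
val out = context.appendPath("/tmp/alpine/log.txt")
try {
  out.write("another line\n".getBytes("UTF-8"))
} finally {
  out.close() // always release the HDFS stream
}
```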

  2. abstract def chorusAPICaller: ChorusAPICaller

    Definition Classes
    ExecutionContext
  3. abstract def chorusUserInfo: ChorusUserInfo

    Definition Classes
    ExecutionContext
  4. abstract def config: CustomOperatorConfig

    Definition Classes
    ExecutionContext
  5. abstract def createPath(path: String, overwrite: Boolean): OutputStream

    Create an HDFS path for writing.

    path
      The HDFS path that we want to create and write to.
    overwrite
      Whether to overwrite the given path if it exists.
    returns
      OutputStream corresponding to the path.
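A minimal sketch of creating and writing a new HDFS file; the `context` value and the path are assumptions for the example:

```scala
// Sketch: create (or overwrite) an HDFS file and write to it,
// assuming `context` is a SparkExecutionContext in scope.
val out = context.createPath("/tmp/alpine/output.csv", overwrite = true)
try {
  out.write("id,value\n1,42\n".getBytes("UTF-8"))
} finally {
  out.close()
}
```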

  6. abstract def deletePath(path: String, recursive: Boolean): Boolean

    Delete the given HDFS path.

    path
      The HDFS path that we want to delete.
    recursive
      If the path is a directory, whether we want to delete it recursively.
    returns
      true if successful, false otherwise.
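A minimal sketch of cleaning up a scratch directory; `context` and the path are assumptions for the example:

```scala
// Sketch: remove a temporary output directory and all its contents,
// assuming `context` is a SparkExecutionContext in scope.
val deleted = context.deletePath("/tmp/alpine/scratch", recursive = true)
if (!deleted) {
  // Deletion can fail, e.g. due to missing permissions; handle accordingly.
}
```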

  7. abstract def doHdfsAction[T](fs: (FileSystem) ⇒ T): T

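This member is undocumented in the source; its signature suggests it runs an arbitrary action against the underlying Hadoop FileSystem and returns the result. A hedged sketch of one plausible use, with `context` and the path assumed:

```scala
import org.apache.hadoop.fs.{FileSystem, Path}

// Sketch: use the underlying Hadoop FileSystem directly, e.g. to
// list the entries of an HDFS directory.
val fileNames: Seq[String] = context.doHdfsAction { fs: FileSystem =>
  fs.listStatus(new Path("/tmp/alpine")).map(_.getPath.getName).toSeq
}
```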
  8. abstract def exists(path: String): Boolean

    Determine whether the given path exists in HDFS.

    path
      The path that we want to check.
    returns
      true if it exists, false otherwise.
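A minimal sketch of guarding a read with an existence check; `context` and the path are assumptions for the example:

```scala
// Sketch: only open the HDFS file if it actually exists.
val path = "/tmp/alpine/input.csv"
if (context.exists(path)) {
  val in = context.openPath(path)
  try {
    // ... read from the stream ...
  } finally {
    in.close()
  }
}
```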

  9. abstract def getSparkAutoTunedParameters[I <: IOBase, JOB <: SparkIOTypedPluginJob[I, _]](jobClass: Class[JOB], input: I, params: OperatorParameters, sparkConf: SparkJobConfiguration, listener: OperatorListener): Map[String, String]

    Returns the map of Spark parameters after the auto-tuning algorithm is applied.

    This is the final set of Spark properties, ready to be passed with the Spark job submission (except spark.job.name). It leverages:
      - user-defined Spark properties at the operator level (the Spark Advanced Settings box), the workflow level, and the data source level (in this order of precedence);
      - AutoTunerOptions set in the SparkJobConfiguration (the sparkConf parameter);
      - auto-tuned parameters that were not user-specified.

    I
      Input type.
    jobClass
      IO typed job class.
    input
      Input to the job. This automatically gets serialized.
    params
      Parameters into the job.
    sparkConf
      Spark job configuration.
    listener
      Listener to pass to the job. The Spark job should be able to communicate directly with Alpine as it runs.
    returns
      The map of relevant Spark properties after the auto-tuning algorithm was applied.
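A hedged sketch of inspecting the auto-tuned properties before submission. `MyJob`, `input`, `params`, `sparkConf`, and `listener` are hypothetical names assumed to be defined by the surrounding operator code:

```scala
// Sketch: compute and print the final Spark properties that auto-tuning
// would produce for a given job class and input.
val tuned: Map[String, String] = context.getSparkAutoTunedParameters(
  classOf[MyJob], input, params, sparkConf, listener)
tuned.foreach { case (key, value) => println(s"$key = $value") }
```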

  10. abstract def mkdir(path: String): Boolean

    Create the directory path.

    path
      The directory path that we want to create.
    returns
      true if it succeeds, false otherwise.
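A minimal sketch of ensuring an output directory exists before writing into it; `context` and the path are assumptions for the example:

```scala
// Sketch: create a results directory up front; check the return value,
// since creation can fail (e.g. permissions).
val created = context.mkdir("/tmp/alpine/results")
if (!created) {
  // Handle the failure, e.g. report it through the operator listener.
}
```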

  11. abstract def openPath(path: String): InputStream

    Open an HDFS path for reading.

    path
      The HDFS path that we want to read from.
    returns
      InputStream corresponding to the path.
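A minimal sketch of reading a small HDFS text file; `context` and the path are assumptions for the example:

```scala
import scala.io.Source

// Sketch: read an HDFS text file line by line. Suitable only for
// small files, since this streams through a single InputStream.
val in = context.openPath("/tmp/alpine/input.csv")
try {
  Source.fromInputStream(in, "UTF-8").getLines().foreach(println)
} finally {
  in.close()
}
```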

  12. abstract def recommendedTempDir: File

    Definition Classes
    ExecutionContext
  13. abstract def submitJob[I <: IOBase, O <: IOBase, JOB <: SparkIOTypedPluginJob[I, O]](jobClass: Class[JOB], input: I, params: OperatorParameters, sparkConf: SparkJobConfiguration, listener: OperatorListener): SubmittedSparkJob[O]

    Submits the IO typed job to Spark.

    IO typed Spark jobs automatically serialize/deserialize inputs and outputs. TODO: Not supported as of yet.

    I
      Input type.
    O
      Output type.
    JOB
      The job type.
    jobClass
      IO typed job class.
    input
      Input to the job. This automatically gets serialized.
    params
      Parameters into the job.
    sparkConf
      Spark job configuration.
    listener
      Listener to pass to the job. The Spark job should be able to communicate directly with Alpine as it runs.
    returns
      A submitted job object.
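A hedged sketch of how a submission call would look. `MyJob`, `input`, `params`, `sparkConf`, and `listener` are hypothetical names assumed to be defined by the surrounding operator, and the documentation above notes this call is not yet supported:

```scala
// Sketch: submit an IO typed Spark job. The input is serialized
// automatically; the returned handle represents the running job.
val submitted: SubmittedSparkJob[MyOutput] =
  context.submitJob(classOf[MyJob], input, params, sparkConf, listener)
```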

  14. abstract def visualModelHelper: HDFSVisualModelHelper

  15. abstract def workflowInfo: WorkflowInfo

    Definition Classes
    ExecutionContext

Concrete Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  10. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  13. final def notify(): Unit

    Definition Classes
    AnyRef
  14. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  15. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  16. def toString(): String

    Definition Classes
    AnyRef → Any
  17. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  18. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from ExecutionContext

Inherited from AnyRef

Inherited from Any
