com.alpine.plugin.core.spark

SparkExecutionContext

trait SparkExecutionContext extends ExecutionContext

:: AlpineSdkApi ::

Annotations
@AlpineSdkApi()
Linear Supertypes
ExecutionContext, AnyRef, Any

Abstract Value Members

  1. abstract def appendPath(path: String): OutputStream

    Append contents to the given HDFS path (see the usage sketch after this member list).

    path
      The HDFS path that we want to append to.
    returns
      The OutputStream corresponding to the path.

  2. abstract def createPath(path: String, overwrite: Boolean): OutputStream

    Create an HDFS path for writing.

    path
      The HDFS path that we want to create and write to.
    overwrite
      Whether to overwrite the given path if it already exists.
    returns
      The OutputStream corresponding to the path.

  3. abstract def deletePath(path: String, recursive: Boolean): Boolean

    Delete the given HDFS path.

    path
      The HDFS path that we want to delete.
    recursive
      If the path is a directory, whether to delete it recursively.
    returns
      true if the deletion succeeded, false otherwise.

  4. abstract def exists(path: String): Boolean

    Determine whether the given path exists in HDFS.

    path
      The path that we want to check.
    returns
      true if the path exists, false otherwise.

  5. abstract def mkdir(path: String): Boolean

    Create the directory path.

    path
      The directory path that we want to create.
    returns
      true if it succeeds, false otherwise.

  6. abstract def openPath(path: String): InputStream

    Open an HDFS path for reading.

    path
      The HDFS path that we want to read from.
    returns
      The InputStream corresponding to the path.

  7. abstract def submitJob[I <: IOBase, O <: IOBase, JOB <: SparkIOTypedPluginJob[I, O]](jobClass: Class[JOB], input: I, params: OperatorParameters, sparkConf: SparkJobConfiguration, listener: OperatorListener): SubmittedSparkJob[O]

    Submit an IO-typed job to Spark. IO-typed Spark jobs automatically serialize/deserialize their inputs and outputs. TODO: Not supported as of yet. (See the call-shape sketch after this member list.)

    I
      Input type.
    O
      Output type.
    JOB
      The job type.
    jobClass
      The IO-typed job class.
    input
      Input to the job. This gets serialized automatically.
    params
      Parameters passed to the job.
    sparkConf
      Spark job configuration.
    listener
      Listener to pass to the job, so the Spark job can communicate directly with Alpine while it is running.
    returns
      A submitted job object.
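
Usage sketch for the HDFS helpers above (appendPath, createPath, deletePath, exists, mkdir, openPath). This is illustrative only: the directory path, the file contents, and the writeAndReadBack helper are assumptions; only the SparkExecutionContext methods themselves come from this page, and it presumes you already have a SparkExecutionContext instance (here called context).

    import java.io.{BufferedReader, InputStreamReader, OutputStream}
    import java.nio.charset.StandardCharsets

    import com.alpine.plugin.core.spark.SparkExecutionContext

    // Illustrative helper: the path and the file contents are made up.
    def writeAndReadBack(context: SparkExecutionContext): Unit = {
      val dir  = "/tmp/alpine-plugin-demo"
      val file = dir + "/notes.txt"

      // Create the parent directory if it does not exist yet.
      if (!context.exists(dir)) {
        context.mkdir(dir)
      }

      // Create (or overwrite) the file and write a first line.
      val created: OutputStream = context.createPath(file, overwrite = true)
      try created.write("first line\n".getBytes(StandardCharsets.UTF_8))
      finally created.close()

      // Append a second line to the same file.
      val appended: OutputStream = context.appendPath(file)
      try appended.write("second line\n".getBytes(StandardCharsets.UTF_8))
      finally appended.close()

      // Read the file back line by line.
      val reader = new BufferedReader(
        new InputStreamReader(context.openPath(file), StandardCharsets.UTF_8))
      try {
        var line = reader.readLine()
        while (line != null) {
          println(line)
          line = reader.readLine()
        }
      } finally {
        reader.close()
      }

      // Clean up: remove the directory and everything under it.
      context.deletePath(dir, recursive = true)
    }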

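The sketch below shows the intended call shape of submitJob, mirroring the documented signature. It is only a thin forwarding wrapper: the import paths for IOBase, OperatorParameters, and OperatorListener are inferred from the SDK's package layout and may need adjusting, and, per the TODO above, this entry point is not yet supported.

    import com.alpine.plugin.core.{OperatorListener, OperatorParameters}
    import com.alpine.plugin.core.io.IOBase
    import com.alpine.plugin.core.spark.{
      SparkExecutionContext, SparkIOTypedPluginJob, SparkJobConfiguration, SubmittedSparkJob
    }

    // Thin wrapper that just forwards its arguments to submitJob.
    // The input is serialized automatically; the listener lets the running
    // Spark job report back to Alpine while it executes.
    def submit[I <: IOBase, O <: IOBase, JOB <: SparkIOTypedPluginJob[I, O]](
        context: SparkExecutionContext,
        jobClass: Class[JOB],
        input: I,
        params: OperatorParameters,
        sparkConf: SparkJobConfiguration,
        listener: OperatorListener): SubmittedSparkJob[O] = {
      context.submitJob(jobClass, input, params, sparkConf, listener)
    }
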
Concrete Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  18. def toString(): String

    Definition Classes
    AnyRef → Any
  19. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  20. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  21. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from ExecutionContext

Inherited from AnyRef

Inherited from Any
