com.alpine.plugin.test.mock

SparkExecutionContextMock

class SparkExecutionContextMock extends SparkExecutionContext

This is a mock version of SparkExecutionContext, for use in tests. It defines the HDFSVisualModelHelper as HDFSVisualModelHelperMock, and chorusAPICaller as the constructor argument.

It can be extended for different behaviour (e.g. mocking the file system).
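
A minimal sketch of such an extension, backing the path methods with an in-memory map. The subclass name is hypothetical, and it assumes the base mock leaves these methods unimplemented; imports for SparkExecutionContextMock and ChorusAPICaller are omitted and resolve from com.alpine.plugin.* in your project.

    import java.io.{ByteArrayInputStream, ByteArrayOutputStream, InputStream, OutputStream}
    import scala.collection.mutable

    // Hypothetical subclass: stores "files" in memory instead of touching HDFS.
    class InMemorySparkExecutionContextMock(caller: ChorusAPICaller)
      extends SparkExecutionContextMock(caller) {

      private val files = mutable.Map.empty[String, Array[Byte]]

      override def exists(path: String): Boolean = files.contains(path)

      // Directories are implicit in this in-memory model.
      override def mkdir(path: String): Boolean = true

      override def createPath(path: String, overwrite: Boolean): OutputStream = {
        require(overwrite || !files.contains(path), s"$path already exists")
        // Capture the written bytes into the map when the stream is closed.
        new ByteArrayOutputStream() {
          override def close(): Unit = { super.close(); files(path) = toByteArray }
        }
      }

      override def openPath(path: String): InputStream =
        new ByteArrayInputStream(files(path))

      override def deletePath(path: String, recursive: Boolean): Boolean =
        files.remove(path).isDefined
    }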

Linear Supertypes
SparkExecutionContext, ExecutionContext, AnyRef, Any

Instance Constructors

  1. new SparkExecutionContextMock(chorusAPICallerMock: ChorusAPICaller)
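
    A minimal construction sketch. ChorusAPICallerMock is assumed to be the companion mock in this package; substitute however you build a ChorusAPICaller for your tests:

        // Assumption: ChorusAPICallerMock provides a test ChorusAPICaller; its
        // constructor may differ in your version of the test library.
        val context: SparkExecutionContext =
          new SparkExecutionContextMock(new ChorusAPICallerMock())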

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def appendPath(path: String): OutputStream

    Append contents to the given HDFS path.

    path: The HDFS path that we want to append to.
    returns: OutputStream corresponding to the path.

    Definition Classes
    SparkExecutionContextMock → SparkExecutionContext
  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. def chorusAPICaller: ChorusAPICaller

  9. def chorusUserInfo: ChorusUserInfo

  10. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  11. def config: CustomOperatorConfig

  12. def createPath(path: String, overwrite: Boolean): OutputStream

    Create an HDFS path for writing. (A usage sketch follows this member list.)

    path: The HDFS path that we want to create and write to.
    overwrite: Whether to overwrite the given path if it exists.
    returns: OutputStream corresponding to the path.

    Definition Classes
    SparkExecutionContextMock → SparkExecutionContext
  13. def deletePath(path: String, recursive: Boolean): Boolean

    Delete the given HDFS path.

    path: The HDFS path that we want to delete.
    recursive: If it's a directory, whether we want to delete the directory recursively.
    returns: true if successful, false otherwise.

    Definition Classes
    SparkExecutionContextMock → SparkExecutionContext
  14. def doHdfsAction[T](fs: (FileSystem) ⇒ T): T

  15. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  16. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  17. def exists(path: String): Boolean

    Determine whether the given path exists in HDFS.

    path: The path that we want to check.
    returns: true if it exists, false otherwise.

    Definition Classes
    SparkExecutionContextMock → SparkExecutionContext
  18. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  19. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  20. def getSparkAutoTunedParameters[I <: IOBase, JOB <: SparkIOTypedPluginJob[I, _]](jobClass: Class[JOB], input: I, params: OperatorParameters, sparkConf: SparkJobConfiguration, listener: OperatorListener): Nothing

    Returns the map of Spark parameters after the auto-tuning algorithm is applied. This is the final set of Spark properties, ready to be passed with the Spark job submission (except spark.job.name). It leverages:
      - user-defined Spark properties at the operator level (the Spark Advanced Settings box), the workflow level, and the data source level, in this order of precedence
      - the AutoTunerOptions set in the SparkJobConfiguration (sparkConf)
      - auto-tuned parameters that were not user-specified

    I: Input type.
    jobClass: IO typed job class.
    input: Input to the job. This automatically gets serialized.
    params: Parameters into the job.
    sparkConf: Spark job configuration.
    listener: Listener to pass to the job. The Spark job should be able to communicate directly with Alpine as it's running.
    returns: The map of relevant Spark properties after the auto-tuning algorithm was applied.

    Definition Classes
    SparkExecutionContextMock → SparkExecutionContext
  21. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  22. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  23. def mkdir(path: String): Boolean

    Create the directory path.

    path: The directory path that we want to create.
    returns: true if it succeeds, false otherwise.

    Definition Classes
    SparkExecutionContextMock → SparkExecutionContext
  24. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  25. final def notify(): Unit

    Definition Classes
    AnyRef
  26. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  27. def openPath(path: String): InputStream

    Open an HDFS path for reading.

    path: The HDFS path that we want to read from.
    returns: InputStream corresponding to the path.

    Definition Classes
    SparkExecutionContextMock → SparkExecutionContext
  28. def recommendedTempDir: File

  29. def submitJob[I <: IOBase, O <: IOBase, JOB <: SparkIOTypedPluginJob[I, O]](jobClass: Class[JOB], input: I, params: OperatorParameters, sparkConf: SparkJobConfiguration, listener: OperatorListener): SubmittedSparkJob[O]

    Submits the IO typed job to Spark. IO typed Spark jobs automatically serialize/deserialize inputs and outputs. TODO: Not supported as of yet.

    I: Input type.
    O: Output type.
    JOB: The job type.
    jobClass: IO typed job class.
    input: Input to the job. This automatically gets serialized.
    params: Parameters into the job.
    sparkConf: Spark job configuration.
    listener: Listener to pass to the job. The Spark job should be able to communicate directly with Alpine as it's running.
    returns: A submitted job object.

    Definition Classes
    SparkExecutionContextMock → SparkExecutionContext
  30. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  31. def toString(): String

    Definition Classes
    AnyRef → Any
  32. def visualModelHelper: HDFSVisualModelHelper

  33. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  35. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  36. def workflowInfo: WorkflowInfo
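
A short usage sketch of the path members above, reusing the hypothetical InMemorySparkExecutionContextMock from the class overview (the base mock itself is not assumed to implement these path methods):

    val ctx = new InMemorySparkExecutionContextMock(new ChorusAPICallerMock())

    // Write, check, read back, and delete an in-memory "HDFS" path.
    val out = ctx.createPath("/tmp/test.csv", overwrite = true)
    out.write("a,b,c\n".getBytes("UTF-8"))
    out.close()
    assert(ctx.exists("/tmp/test.csv"))

    val contents = scala.io.Source.fromInputStream(ctx.openPath("/tmp/test.csv")).mkString
    assert(contents == "a,b,c\n")
    assert(ctx.deletePath("/tmp/test.csv", recursive = false))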
