abstract class SparkApp[E <: Env] extends AnyRef

During the development lifecycle of Spark applications, it is useful to create sandbox environments, comprising paths, Hive databases, etc., that are tied to specific logical environments (e.g. dev, test, prod) and to feature development (i.e. Git branches). For example, when working on a feature called new_feature for a project called my_project, the application should write its data to paths under /data/dev/my_project/new_feature/ and create tables in a database called dev_my_project_new_feature. The actual layout of these environments can be defined by extending Env or one of its subclasses; the final implementation should be a case class whose values (e.g. env, branch) define the environment.

This is a generic Spark application that uses an implementation of Env to generate application-specific configuration and subsequently parses this configuration into a case class used by the application logic.
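A minimal sketch of how such an application might be assembled. The shape of the Env subclass, its field names (MyEnv, env, branch), and the paths are assumptions for illustration, not part of this API:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical environment: a case class whose values (env, branch)
// describe the sandbox the application runs in.
case class MyEnv(env: String, branch: String) extends Env

// A concrete application parameterised by that environment.
object MyApp extends SparkApp[MyEnv] {

  // Default Spark configuration derived from the environment.
  override def confDefaults(env: MyEnv): Map[String, String] =
    Map("spark.sql.warehouse.dir" ->
      s"/data/${env.env}/my_project/${env.branch}/warehouse")

  // The application logic itself, confined to the sandbox.
  override protected def run(sparkSession: SparkSession, env: MyEnv): Unit = {
    val db = s"${env.env}_my_project_${env.branch}"
    sparkSession.sql(s"CREATE DATABASE IF NOT EXISTS $db")
  }
}
```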

E

the type of the Env implementation (must be a case class)

Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new SparkApp()(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[E])

Abstract Value Members

  1. abstract def confDefaults(env: E): Map[String, String]

    Default Spark configuration values to use for the application

    env

    the environment

    returns

    a map containing default Spark configuration
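    As an illustration, an implementation might derive its defaults from the fields of the environment case class; the names below (MyEnv, my_project) are hypothetical:

```scala
// Hypothetical: defaults keyed off the environment case class fields,
// assuming an Env subclass `case class MyEnv(env: String, branch: String)`.
override def confDefaults(env: MyEnv): Map[String, String] = Map(
  "spark.sql.warehouse.dir" ->
    s"/data/${env.env}/my_project/${env.branch}/warehouse",
  "spark.sql.shuffle.partitions" -> "200"
)
```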

  2. abstract def run(sparkSession: SparkSession, env: E): Unit

    Run the application for the given environment and configuration case classes

    sparkSession

    the SparkSession

    env

    the environment

    Attributes
    protected

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def cleanupEnv(sparkSession: SparkSession, envPrefix: String): Unit

    Cleans up the environment associated with this application

    sparkSession

    the SparkSession

    envPrefix

    the prefix for keys in the SparkConf needed by the Env implementation

  6. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  7. def createEnv(sparkSession: SparkSession, envPrefix: String): Unit

    Creates the environment associated with this application

    sparkSession

    the SparkSession

    envPrefix

    the prefix for keys in the SparkConf needed by the Env implementation

  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  12. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  13. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  16. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  17. def parseEnv(sparkSession: SparkSession, envPrefix: String): E

    Parses configuration in the SparkSession into the environment case class (type E)

    sparkSession

    the SparkSession

    envPrefix

    the prefix for keys in the SparkConf needed by the Env implementation

    returns

    a parsed case class of type E

  18. def runSparkApp(sparkSession: SparkSession, envPrefix: String): Unit

    Runs the application

    N.B. this does not create the environment; use createEnv first

    sparkSession

    the SparkSession

    envPrefix

    the prefix for keys in the SparkConf needed by the Env implementation

  19. def runWithEnv(env: E, sparkSession: SparkSession): Unit
    Attributes
    protected
  20. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  21. def toString(): String
    Definition Classes
    AnyRef → Any
  22. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
