abstract class ETL extends Runnable

Defines a common workflow for ETL jobs. By definition, an ETL can take 1..n sources as input and produces exactly one output.
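A minimal sketch of a concrete subclass under this workflow. The subclass name, dataset id, parquet path, `conf.getDataset`, and the column names are assumptions for illustration; only `ETL`, `DatasetConf`, `Configuration`, and `toMain` come from this API. Defaults for the date parameters are inherited from the abstract declarations, so the overrides need not repeat them.

```scala
import java.time.LocalDateTime
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{col, lit}

// Hypothetical single-source ETL: extract() only reads and filters,
// transform() holds all the derivation logic.
class RawToNormalized()(implicit conf: Configuration) extends ETL {

  // conf.getDataset and the dataset id are assumptions.
  override def mainDestination: DatasetConf = conf.getDataset("normalized_patients")

  override def extract(lastRunDateTime: LocalDateTime,
                       currentRunDateTime: LocalDateTime)(
      implicit spark: SparkSession): Map[String, DataFrame] = {
    // Filtering at extract time is allowed and keeps the ETL efficient.
    val raw = spark.read.parquet("s3a://bucket/raw/patients") // hypothetical path
    Map("raw_patients" -> raw.where(col("updated_on") > lit(lastRunDateTime.toString)))
  }

  override def transform(data: Map[String, DataFrame],
                         lastRunDateTime: LocalDateTime,
                         currentRunDateTime: LocalDateTime)(
      implicit spark: SparkSession): Map[String, DataFrame] =
    // No new reads here; only shape the data handed over by extract().
    toMain(data("raw_patients").dropDuplicates("patient_id"))
}
```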

Linear Supertypes
Runnable, AnyRef, Any

Instance Constructors

  1. new ETL()(implicit conf: Configuration)

    conf

    application configuration

Abstract Value Members

  1. abstract def extract(lastRunDateTime: LocalDateTime = minDateTime, currentRunDateTime: LocalDateTime = LocalDateTime.now())(implicit spark: SparkSession): Map[String, DataFrame]

    Reads data from a file system and produces a Map[String, DataFrame]. This method should avoid transformations and joins, but can implement filters in order to make the ETL more efficient.

    spark

    an instance of SparkSession

    returns

    all the data needed to pass to the transform method and produce the desired output.

  2. abstract def mainDestination: DatasetConf
  3. abstract def transform(data: Map[String, DataFrame], lastRunDateTime: LocalDateTime = minDateTime, currentRunDateTime: LocalDateTime = LocalDateTime.now())(implicit spark: SparkSession): Map[String, DataFrame]

    Takes a Map[String, DataFrame] as input and applies a set of transformations to it to produce the ETL output. It is recommended not to read any additional data here; use the extract() method to inject input data instead.

    data

    input data

    spark

    an instance of SparkSession

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  6. implicit val conf: Configuration
  7. def defaultRepartition: (DataFrame) ⇒ DataFrame
  8. def defaultSampling: PartialFunction[String, (DataFrame) ⇒ DataFrame]
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  12. def getLastRunDateFor(ds: DatasetConf)(implicit spark: SparkSession): LocalDateTime

    If possible, fetches the last run date time from the dataset passed as argument.

    ds

    dataset

    spark

    a spark session

    returns

    the last run date or the minDateTime
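The fallback to minDateTime makes first runs behave like full loads. A short sketch, assuming `etl` is an instance of a concrete ETL subclass:

```scala
import java.time.LocalDateTime

// On the first run the destination holds no data yet, so the method
// returns minDateTime and the extract() window covers all history.
val lastRun: LocalDateTime = etl.getLastRunDateFor(etl.mainDestination)
val isFirstRun: Boolean = lastRun == etl.minDateTime
```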

  13. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  14. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  15. def load(data: Map[String, DataFrame], lastRunDateTime: LocalDateTime = minDateTime, currentRunDateTime: LocalDateTime = LocalDateTime.now(), repartition: (DataFrame) ⇒ DataFrame = defaultRepartition)(implicit spark: SparkSession): Map[String, DataFrame]

    Loads the output data into persistent storage. The output destination can be an object store, a database, flat files, etc.

    data

    output data produced by the transform method.

    spark

    an instance of SparkSession
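A sketch of supplying a custom repartition function instead of defaultRepartition; the `transformed` map and the `ingested_on` column are assumptions:

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// Cluster the output by ingestion date before writing, rather than
// relying on defaultRepartition.
val byIngestionDate: DataFrame => DataFrame = _.repartition(col("ingested_on"))
etl.load(transformed, repartition = byIngestionDate)
```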

  16. def loadDataset(df: DataFrame, ds: DatasetConf, repartition: (DataFrame) ⇒ DataFrame)(implicit spark: SparkSession): DataFrame
  17. val log: Logger
  18. val maxDateTime: LocalDateTime
  19. val minDateTime: LocalDateTime
  20. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  21. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  22. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  23. def publish()(implicit spark: SparkSession): Unit

    OPTIONAL - Contains all actions needed to make the data available to users, such as creating a view over the data.

    spark

    an instance of SparkSession

  24. def replaceWhere: Option[String]

    replaceWhere is used for the OverWriteStaticPartition load. It avoids computing the dataframe to infer which partitions to replace. Most of the time these partitions can be inferred statically; always prefer that over dynamically overwriting partitions.
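For instance, a job that always rewrites yesterday's partition can state the predicate statically (the `ingested_on` column and the date logic are assumptions):

```scala
// Inside a concrete ETL subclass: the partition to replace is known
// before any dataframe is computed, so no scan is needed to find it.
override def replaceWhere: Option[String] =
  Some(s"ingested_on = '${java.time.LocalDate.now().minusDays(1)}'")
```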

  25. def reset()(implicit spark: SparkSession): Unit

    Resets the ETL by removing the destination dataset.

  26. def run(runSteps: Seq[RunStep] = RunStep.default_load, lastRunDateTime: Option[LocalDateTime] = None, currentRunDateTime: Option[LocalDateTime] = None)(implicit spark: SparkSession): Map[String, DataFrame]

    Entry point of the ETL - execute this method in order to run the whole ETL.

    spark

    an instance of SparkSession

    Definition Classes
    ETLRunnable
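Typical invocations might look like this; `etl` is assumed to be an instance of a concrete subclass, and only run's parameters and RunStep.default_load appear in this API:

```scala
import java.time.LocalDateTime
import org.apache.spark.sql.SparkSession

implicit val spark: SparkSession = SparkSession.builder().getOrCreate()

// Default steps; run dates are resolved by the ETL itself
// (e.g. via getLastRunDateFor on the destination).
etl.run()

// Explicit incremental window for a backfill of the last day.
etl.run(
  runSteps = RunStep.default_load,
  lastRunDateTime = Some(LocalDateTime.now().minusDays(1)),
  currentRunDateTime = Some(LocalDateTime.now())
)
```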
  27. def sampling: PartialFunction[String, (DataFrame) ⇒ DataFrame]

    Logic used when the ETL is run as a SAMPLE_LOAD
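A sketch of a sampling override inside a concrete subclass; the dataset id and the 1% ratio are assumptions (see defaultSampling for the library's baseline):

```scala
import org.apache.spark.sql.DataFrame

// Only applied when the ETL runs as a SAMPLE_LOAD: keep roughly 1%
// of the named source so downstream steps stay fast.
override def sampling: PartialFunction[String, DataFrame => DataFrame] = {
  case "raw_patients" => df => df.sample(0.01)
}
```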

  28. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  29. def toMain(df: ⇒ DataFrame): Map[String, DataFrame]
  30. def toString(): String
    Definition Classes
    AnyRef → Any
  31. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  32. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  33. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated
