Packages


bio.ferlab.datalake.spark3.etl.v3

TransformationsETL

class TransformationsETL[T <: Configuration] extends SingleETL[T]

Annotations
@deprecated
Deprecated

(Since version 11.0.0) use v4.TransformationsETL instead

Linear Supertypes
SingleETL[T], ETL[T], AnyRef, Any

Instance Constructors

  1. new TransformationsETL(context: DeprecatedETLContext[T], source: DatasetConf, mainDestination: DatasetConf, transformations: List[Transformation])
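A minimal sketch of wiring up this (deprecated) v3 class. All names below are hypothetical: the dataset ids, the `etlContext` value, and the transformation instances are illustrative, and the exact way a `DeprecatedETLContext[T]` is built depends on your application setup.

```scala
import bio.ferlab.datalake.commons.config.DatasetConf

// Hypothetical dataset ids; resolve them from your Configuration instance.
val source: DatasetConf          = conf.getDataset("raw_patients")
val mainDestination: DatasetConf = conf.getDataset("normalized_patients")

// Chain of generic transformations applied between extract and load.
val etl = new TransformationsETL(
  context = etlContext,            // a DeprecatedETLContext[T] built elsewhere
  source = source,
  mainDestination = mainDestination,
  transformations = List(/* e.g. rename / drop / custom Transformation values */)
)
```

Note the deprecation above: for new code, prefer `v4.TransformationsETL`, which has the same overall shape.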

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  6. implicit val conf: Configuration
    Definition Classes
    ETL
  7. def defaultRepartition: (DataFrame) ⇒ DataFrame
    Definition Classes
    ETL
  8. def defaultSampling: PartialFunction[String, (DataFrame) ⇒ DataFrame]
    Definition Classes
    ETL
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. def extract(lastRunDateTime: LocalDateTime, currentRunDateTime: LocalDateTime): Map[String, DataFrame]

    Reads data from a file system and produces a Map[DatasetConf, DataFrame]. This method should avoid transformations and joins, but can apply filters in order to make the ETL more efficient.

    returns

    all the data needed to pass to the transform method and produce the desired output.

    Definition Classes
    TransformationsETL → ETL
  12. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  13. def getLastRunDateFor(ds: DatasetConf): LocalDateTime

    If possible, fetch the last run date time from the dataset passed in argument.

    ds

    dataset

    returns

    the last run date or the minDateTime

    Definition Classes
    ETL
  14. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  15. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  16. final def load(data: Map[String, DataFrame], lastRunDateTime: LocalDateTime, currentRunDateTime: LocalDateTime): Map[String, DataFrame]

    Loads the output data into a persistent storage. The output destination can be an object store, a database, flat files, etc.

    data

    output data produced by the transform method.

    Definition Classes
    SingleETL → ETL
  17. def loadDataset(df: DataFrame, ds: DatasetConf): DataFrame
    Definition Classes
    ETL
  18. def loadSingle(data: DataFrame, lastRunDateTime: LocalDateTime = minDateTime, currentRunDateTime: LocalDateTime = LocalDateTime.now()): DataFrame
    Definition Classes
    SingleETL
  19. val log: Logger
    Definition Classes
    ETL
  20. val mainDestination: DatasetConf
    Definition Classes
    TransformationsETL → ETL
  21. val maxDateTime: LocalDateTime
    Definition Classes
    ETL
  22. val minDateTime: LocalDateTime
    Definition Classes
    ETL
  23. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  24. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  25. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  26. def publish(): Unit

    OPTIONAL - Contains all actions needed to make the data available to users, such as creating a view over the data.

    Definition Classes
    ETL
  27. def replaceWhere: Option[String]

    replaceWhere is used for the OverWriteStaticPartition load. It avoids computing the dataframe to infer which partitions to replace. Most of the time these partitions can be inferred statically; always prefer that over dynamically overwriting partitions.

    Definition Classes
    ETL
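In a subclass, a static partition predicate might look like the sketch below. The partition column and value are hypothetical; the string is a Spark SQL predicate over the destination's partition columns, as used by Delta's replaceWhere overwrite option.

```scala
// Sketch: overwrite only the current batch's partition, stated statically,
// instead of letting Spark compute the dataframe to infer the partitions.
override def replaceWhere: Option[String] =
  Some("batch_date = '2023-01-01'")  // hypothetical partition column
```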
  28. def reset(): Unit

    Reset the ETL by removing the destination dataset.

    Definition Classes
    ETL
  29. def run(lastRunDateTime: Option[LocalDateTime] = None, currentRunDateTime: Option[LocalDateTime] = None): Map[String, DataFrame]

    Entry point of the ETL - execute this method in order to run the whole ETL.

    Definition Classes
    ETL
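A typical invocation, as a sketch; it assumes an `etl` instance built as above, with the implicit `SparkSession` and `Configuration` required by ETL already in scope.

```scala
import java.time.LocalDateTime

// Run the full extract -> transform -> load pipeline. With no arguments,
// both date times default to None and the ETL falls back to
// getLastRunDateFor / minDateTime internally.
val result: Map[String, DataFrame] = etl.run()

// Or replay a specific window explicitly:
etl.run(
  lastRunDateTime = Some(LocalDateTime.of(2023, 1, 1, 0, 0)),
  currentRunDateTime = Some(LocalDateTime.now())
)
```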
  30. def sampling: PartialFunction[String, (DataFrame) ⇒ DataFrame]

    Logic used when the ETL is run as a RunStep.sample step.

    Definition Classes
    ETL
  31. val source: DatasetConf
  32. implicit val spark: SparkSession
    Definition Classes
    ETL
  33. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  34. def toMain(df: ⇒ DataFrame): Map[String, DataFrame]
    Definition Classes
    ETL
  35. def toString(): String
    Definition Classes
    AnyRef → Any
  36. final def transform(data: Map[String, DataFrame], lastRunDateTime: LocalDateTime = minDateTime, currentRunDateTime: LocalDateTime = LocalDateTime.now()): Map[String, DataFrame]

    Takes a Map[DatasetConf, DataFrame] as input and applies a set of transformations to it to produce the ETL output. It is recommended not to read any additional data here; use the extract() method to inject input data instead.

    data

    input data

    Definition Classes
    SingleETL → ETL
  37. def transformSingle(data: Map[String, DataFrame], lastRunDateTime: LocalDateTime, currentRunDateTime: LocalDateTime): DataFrame

    Takes a DataFrame as input and applies a set of transformations to it to produce the ETL output. It is recommended not to read any additional data here; use the extract() method to inject input data instead.

    data

    input data

    Definition Classes
    TransformationsETL → SingleETL
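Conceptually, this override folds the `transformations` list over the source dataframe, applying each step in order. The sketch below illustrates the idea only (it is not the library's actual code) and assumes each Transformation exposes a DataFrame-to-DataFrame function:

```scala
// Sketch: apply each Transformation in declaration order.
def applyAll(df: DataFrame, transformations: List[Transformation]): DataFrame =
  transformations.foldLeft(df)((acc, t) => t.transform(acc))
```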
  38. val transformations: List[Transformation]
  39. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  40. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  41. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated

Inherited from SingleETL[T]

Inherited from ETL[T]

Inherited from AnyRef

Inherited from Any
