case class CustomFileDataObject(id: DataObjectId, creator: CustomFileCreatorConfig, metadata: Option[DataObjectMetadata] = None)(implicit instanceRegistry: InstanceRegistry) extends DataObject with FileRefDataObject with CanCreateInputStream with Product with Serializable

Linear Supertypes
Serializable, Serializable, Product, Equals, CanCreateInputStream, FileRefDataObject, FileDataObject, CanHandlePartitions, DataObject, AtlasExportable, SmartDataLakeLogger, ParsableFromConfig[DataObject], SdlConfigObject, AnyRef, Any

Instance Constructors

  1. new CustomFileDataObject(id: DataObjectId, creator: CustomFileCreatorConfig, metadata: Option[DataObjectMetadata] = None)(implicit instanceRegistry: InstanceRegistry)
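A hedged sketch of how such a DataObject might be declared in a Smart Data Lake Builder HOCON configuration. The `className` option inside `creator` is an assumption about CustomFileCreatorConfig, not verified against this exact version:

```hocon
dataObjects {
  # Hypothetical example: a CustomFileDataObject whose content is produced
  # by a user-supplied custom file creator implementation.
  my-custom-file {
    type = CustomFileDataObject
    creator {
      # className is assumed here; check CustomFileCreatorConfig for the
      # options supported by your SDLB version.
      className = com.example.MyFileCreator
    }
  }
}
```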

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def atlasName: String
    Definition Classes
    DataObject → AtlasExportable
  6. def atlasQualifiedName(prefix: String): String
    Definition Classes
    AtlasExportable
  7. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  8. def createInputStream(path: String)(implicit context: ActionPipelineContext): InputStream
    Definition Classes
    CustomFileDataObject → CanCreateInputStream
  9. val creator: CustomFileCreatorConfig
  10. def deleteAll(implicit context: ActionPipelineContext): Unit

    Delete all data. This is used to implement SaveMode.Overwrite.

    Definition Classes
    FileRefDataObject
    Annotations
    @Scaladoc()
  11. def deleteFileRefs(fileRefs: Seq[FileRef])(implicit context: ActionPipelineContext): Unit

    Delete given files. This is used to clean up files after they are processed.

    Definition Classes
    FileRefDataObject
    Annotations
    @Scaladoc()
  12. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. def expectedPartitionsCondition: Option[String]

    Definition of partitions that are expected to exist. This is used to validate that partitions being read exist and do not return empty data. Define a Spark SQL expression that is evaluated against a PartitionValues instance and returns true or false. Example: "elements['yourColName'] > 2017"

    returns

    true if partition is expected to exist.

    Definition Classes
    CustomFileDataObject → CanHandlePartitions
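A hedged configuration sketch of how such a condition is typically set, assuming the DataObject at hand exposes `expectedPartitionsCondition` as a config option (the surrounding DataObject type and paths are illustrative only):

```hocon
dataObjects {
  my-partitioned-files {
    type = CsvFileDataObject      # hypothetical partitioned file DataObject
    path = "data/events"
    partitions = [yourColName]
    # Only partitions where yourColName > 2017 are expected to exist;
    # reading a missing expected partition can then be detected.
    expectedPartitionsCondition = "elements['yourColName'] > 2017"
  }
}
```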
  14. def extractPartitionValuesFromPath(filePath: String)(implicit context: ActionPipelineContext): PartitionValues

    Extract partition values from a given file path.

    Attributes
    protected
    Definition Classes
    FileRefDataObject
    Annotations
    @Scaladoc()
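The idea behind this method can be sketched in plain Scala: derive a regex from a partition layout and read the partition values back out of a file path. This is a simplified stand-in for illustration, not the library's implementation:

```scala
object PartitionPathDemo {
  // Turn each %col% token in the layout into a capturing group, then match
  // the path against the resulting regex to recover the partition values.
  def extract(layout: String, path: String): Map[String, String] = {
    val token = "%([^%]+)%".r
    // Partition column names, in the order they appear in the layout.
    val cols = token.findAllMatchIn(layout).map(_.group(1)).toList
    val pattern = token.replaceAllIn(layout, "([^/]+)").r
    pattern.findFirstMatchIn(path)
      .map(m => cols.zip(m.subgroups).toMap)
      .getOrElse(Map.empty)
  }

  def main(args: Array[String]): Unit =
    println(extract("year=%year%/month=%month%/", "year=2024/month=05/part-0000.csv"))
}
```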
  15. def factory: FromConfigFactory[DataObject]

    Returns the factory that can parse this type (that is, type CO).

    Typically, implementations of this method should return the companion object of the implementing class. The companion object in turn should implement FromConfigFactory.

    returns

    the factory (object) for this class.

    Definition Classes
    CustomFileDataObject → ParsableFromConfig
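The companion-object pattern described above can be sketched with simplified stand-in traits. These are not SDLB's real ParsableFromConfig / FromConfigFactory signatures, only an illustration of the shape:

```scala
// Simplified stand-ins for SDLB's config-parsing traits (assumptions, not the real API).
trait FromConfigFactory[+T] {
  def fromConfig(config: Map[String, String]): T
}
trait ParsableFromConfig[T] {
  def factory: FromConfigFactory[T]
}

// A hypothetical DataObject: `factory` returns the companion object,
// which implements FromConfigFactory and knows how to build an instance.
case class MyDataObject(id: String) extends ParsableFromConfig[MyDataObject] {
  override def factory: FromConfigFactory[MyDataObject] = MyDataObject
}
object MyDataObject extends FromConfigFactory[MyDataObject] {
  override def fromConfig(config: Map[String, String]): MyDataObject =
    MyDataObject(config("id"))
}
```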
  16. val fileName: String

    Definition of fileName. Default is an asterisk to match everything. This is concatenated with the partition layout to search for files.

    Definition Classes
    FileRefDataObject
    Annotations
    @Scaladoc()
  17. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  18. def getConnection[T <: Connection](connectionId: ConnectionId)(implicit registry: InstanceRegistry, ct: ClassTag[T], tt: scala.reflect.api.JavaUniverse.TypeTag[T]): T

    Handles class cast exceptions when getting objects from the instance registry.

    Attributes
    protected
    Definition Classes
    DataObject
    Annotations
    @Scaladoc()
  19. def getConnectionReg[T <: Connection](connectionId: ConnectionId, registry: InstanceRegistry)(implicit ct: ClassTag[T], tt: scala.reflect.api.JavaUniverse.TypeTag[T]): T
    Attributes
    protected
    Definition Classes
    DataObject
  20. def getFileRefs(partitionValues: Seq[PartitionValues])(implicit context: ActionPipelineContext): Seq[FileRef]

    List files for given partition values.

    partitionValues

    List of partition values to be filtered. If empty all files in root path of DataObject will be listed.

    returns

    List of FileRefs

    Definition Classes
    CustomFileDataObject → FileRefDataObject
  21. def getPartitionString(partitionValues: PartitionValues)(implicit context: ActionPipelineContext): Option[String]

    Get partition values formatted by the partition layout.

    Definition Classes
    FileRefDataObject
    Annotations
    @Scaladoc()
  22. def getPath(implicit context: ActionPipelineContext): String

    Method for subclasses to override the base path for this DataObject. This is for instance needed if pathPrefix is defined in a connection.

    Definition Classes
    FileRefDataObject
    Annotations
    @Scaladoc()
  23. def getSearchPaths(partitionValues: Seq[PartitionValues])(implicit context: ActionPipelineContext): Seq[(PartitionValues, String)]

    Prepare the paths to be searched.

    Attributes
    protected
    Definition Classes
    FileRefDataObject
    Annotations
    @Scaladoc()
  24. def housekeepingMode: Option[HousekeepingMode]

    Configure a housekeeping mode to e.g. clean up, archive and compact partitions. Default is None.

    Definition Classes
    DataObject
    Annotations
    @Scaladoc()
  25. val id: DataObjectId

    A unique identifier for this instance.

    Definition Classes
    CustomFileDataObject → DataObject → SdlConfigObject
  26. implicit val instanceRegistry: InstanceRegistry
  27. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  28. def listPartitions(implicit context: ActionPipelineContext): Seq[PartitionValues]

    List partition values.

    Definition Classes
    CustomFileDataObject → CanHandlePartitions
  29. lazy val logger: Logger
    Attributes
    protected
    Definition Classes
    SmartDataLakeLogger
    Annotations
    @transient()
  30. val metadata: Option[DataObjectMetadata]

    Additional metadata for the DataObject.

    Definition Classes
    CustomFileDataObject → DataObject
  31. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  32. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  33. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  34. def partitionLayout(): Option[String]

    Definition of the partition layout. Use %&lt;partitionColName&gt;% as a placeholder and * for globs in the layout. Note: if you have globs in the partition layout, it's not possible to write files to this DataObject. Note: if this is a directory, you must add a final backslash to the partition layout.

    Definition Classes
    CustomFileDataObject → FileRefDataObject
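The placeholder substitution described above can be sketched in plain Scala (a simplified illustration, not the library's implementation):

```scala
object PartitionLayoutDemo {
  // Replace each %<partitionColName>% placeholder in the layout with its value.
  def render(layout: String, partitionValues: Map[String, String]): String =
    partitionValues.foldLeft(layout) { case (acc, (col, value)) =>
      acc.replace(s"%$col%", value)
    }

  def main(args: Array[String]): Unit =
    // A directory layout ends with a final path separator, as noted above.
    println(render("year=%year%/month=%month%/", Map("year" -> "2024", "month" -> "05")))
}
```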
  35. def partitions: Seq[String]

    Definition of partition columns.

    Definition Classes
    CustomFileDataObject → CanHandlePartitions
  36. def path: String

    The root path of the files that are handled by this DataObject.

    Definition Classes
    CustomFileDataObject → FileDataObject
  37. def prepare(implicit context: ActionPipelineContext): Unit

    Prepare & test the DataObject's prerequisites.

    This runs during the "prepare" operation of the DAG.

    Definition Classes
    FileDataObject → DataObject
  38. def relativizePath(filePath: String)(implicit context: ActionPipelineContext): String

    Make a given path relative to this DataObject's base path.

    Definition Classes
    CustomFileDataObject → FileDataObject
  39. def saveMode: SDLSaveMode

    Overwrite or Append new data. When writing partitioned data, this applies only to the partitions concerned.

    Definition Classes
    CustomFileDataObject → FileRefDataObject
  40. val separator: Char

    Default separator for paths.

    Attributes
    protected
    Definition Classes
    FileDataObject
    Annotations
    @Scaladoc()
  41. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  42. def toStringShort: String
    Definition Classes
    DataObject
  43. def translateFileRefs(fileRefs: Seq[FileRef])(implicit context: ActionPipelineContext): Seq[FileRefMapping]

    Given some FileRefs for another DataObject, translate the paths to the root path of this DataObject.

    Definition Classes
    FileRefDataObject
    Annotations
    @Scaladoc()
  44. def validateSchemaHasPartitionCols(df: DataFrame, role: String): Unit

    Validate that the schema of the given Spark DataFrame df contains the specified partition columns.

    df

    The data frame to validate.

    role

    role used in exception message. Set to read or write.

    Definition Classes
    CanHandlePartitions
    Annotations
    @Scaladoc()
    Exceptions thrown

    SchemaViolationException if the partition columns are not included.

  45. def validateSchemaHasPrimaryKeyCols(df: DataFrame, primaryKeyCols: Seq[String], role: String): Unit

    Validate that the schema of the given Spark DataFrame df contains the specified primary key columns.

    df

    The data frame to validate.

    role

    role used in exception message. Set to read or write.

    Definition Classes
    CanHandlePartitions
    Annotations
    @Scaladoc()
    Exceptions thrown

    SchemaViolationException if the primary key columns are not included.

  46. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  47. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  48. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated
