
trait DeltaSourceBase extends Source with SupportsAdmissionControl with DeltaLogging

Base trait for the Delta source that contains the methods for reading changes from the Delta log.

Self Type
DeltaSource
Linear Supertypes
DeltaLogging, DatabricksLogging, DeltaProgressReporter, Logging, SupportsAdmissionControl, Source, SparkDataStream, AnyRef, Any

Abstract Value Members

  1. abstract def getBatch(start: Option[Offset], end: Offset): DataFrame
    Definition Classes
    Source
  2. abstract def getOffset: Option[Offset]
    Definition Classes
    Source
  3. abstract def latestOffset(arg0: Offset, arg1: ReadLimit): Offset
    Definition Classes
    SupportsAdmissionControl
  4. abstract def stop(): Unit
    Definition Classes
    SparkDataStream
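
The four abstract members above form the streaming contract that a concrete implementation (the DeltaSource self type) must fulfil. A schematic sketch of that shape, with placeholder bodies and wiring that is hypothetical rather than the real DeltaSource logic:

```scala
// Schematic sketch only: Offset, ReadLimit, and DataFrame come from
// Spark, and every body below is a placeholder, not real Delta code.
class SketchDeltaSource(/* deltaLog, options, ... */) extends DeltaSourceBase {
  // Latest offset available, or None if no data has arrived yet.
  override def getOffset: Option[Offset] = ???

  // Latest offset capped by the admission-control ReadLimit
  // (e.g. a maxFilesPerTrigger-style cap).
  override def latestOffset(startOffset: Offset, limit: ReadLimit): Offset = ???

  // Materialize the data between the two offsets as a DataFrame.
  override def getBatch(start: Option[Offset], end: Offset): DataFrame = ???

  // Release any resources (snapshots, iterators) held by the source.
  override def stop(): Unit = ???
}
```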

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def cleanUpSnapshotResources(): Unit
    Attributes
    protected
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  7. def commit(end: Offset): Unit
    Definition Classes
    Source → SparkDataStream
  8. def commit(end: Offset): Unit
    Definition Classes
    Source
  9. def createDataFrame(indexedFiles: Iterator[IndexedFile]): DataFrame

    Given an iterator of file actions, create a DataFrame representing the files added to a table. Only AddFile actions will be used to create the DataFrame.

    indexedFiles

    The actions iterator from which to generate the DataFrame.

    Attributes
    protected
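
Since only AddFile actions contribute rows, the first step can be pictured as a filter over the iterator. A hypothetical sketch, assuming IndexedFile exposes its AddFile (if any) via an `add` field as in the Delta Lake source:

```scala
// Hypothetical sketch: keep only the entries that actually carry an
// AddFile action before building the DataFrame from them.
def addFilesOnly(indexedFiles: Iterator[IndexedFile]): Iterator[AddFile] =
  indexedFiles.collect { case f if f.add != null => f.add }
```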
  10. def createDataFrameBetweenOffsets(startVersion: Long, startIndex: Long, isStartingVersion: Boolean, startSourceVersion: Option[Long], startOffsetOption: Option[Offset], endOffset: DeltaSourceOffset): DataFrame

    Returns the DataFrame between the start and end offsets.

    Attributes
    protected
  11. def deserializeOffset(json: String): Offset
    Definition Classes
    Source → SparkDataStream
  12. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  14. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  15. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  16. def getDefaultReadLimit(): ReadLimit
    Definition Classes
    SupportsAdmissionControl
  17. def getFileChangesAndCreateDataFrame(startVersion: Long, startIndex: Long, isStartingVersion: Boolean, endOffset: DeltaSourceOffset): DataFrame

    Gets the changes from startVersion, startIndex to the end offset.

    startVersion

    The calculated starting version.

    startIndex

    The calculated starting index.

    isStartingVersion

    Whether the stream has to return the initial snapshot or not.

    endOffset

    The offset that signifies the end of the stream.

    Attributes
    protected
  18. def getFileChangesWithRateLimit(fromVersion: Long, fromIndex: Long, isStartingVersion: Boolean, limits: Option[AdmissionLimits] = Some(new AdmissionLimits())): ClosableIterator[IndexedFile]
    Attributes
    protected
  19. def getNextOffsetFromPreviousOffset(previousOffset: DeltaSourceOffset, limits: Option[AdmissionLimits]): Option[Offset]

    Returns the next offset when a previous offset exists.

    Attributes
    protected
  20. def getStartingOffsetFromSpecificDeltaVersion(fromVersion: Long, isStartingVersion: Boolean, limits: Option[AdmissionLimits]): Option[Offset]

    Returns the offset that starts from a specific Delta table version. This function is called when starting a new stream query.

    fromVersion

    The version of the delta table to calculate the offset from.

    isStartingVersion

    Whether the delta version is for the initial snapshot or not.

    limits

    Indicates how much data can be processed by a micro batch.

    Attributes
    protected
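
When a new stream query starts, the source first resolves where to read from and then computes the first offset under the configured limits. A hedged sketch of that call pattern (the values assigned here are illustrative; only the signature comes from the entry above):

```scala
// Hypothetical wiring: resolve the start, then compute the first offset
// while honouring admission-control limits for the first micro-batch.
val fromVersion: Long = 12L               // e.g. from a startingVersion option
val isStartingVersion: Boolean = false    // true => serve the initial snapshot
val limits = Some(new AdmissionLimits())  // default rate limits

val firstOffset: Option[Offset] =
  getStartingOffsetFromSpecificDeltaVersion(fromVersion, isStartingVersion, limits)
```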
  21. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  22. def initialOffset(): Offset
    Definition Classes
    Source → SparkDataStream
  23. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  24. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  25. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  26. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  27. val lastOffsetForTriggerAvailableNow: DeltaSourceOffset
    Attributes
    protected
  28. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  29. def logConsole(line: String): Unit
    Definition Classes
    DatabricksLogging
  30. def logDebug(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  31. def logDebug(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  32. def logError(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  33. def logError(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logInfo(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  35. def logInfo(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  36. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  37. def logTrace(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  38. def logTrace(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  39. def logWarning(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  40. def logWarning(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  41. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  42. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  43. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  44. def recordDeltaEvent(deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty, data: AnyRef = null, path: Option[Path] = None): Unit

    Used to record the occurrence of a single event or to report detailed, operation-specific statistics.

    path

    Used to log the path of the delta table when deltaLog is null.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  45. def recordDeltaOperation[A](deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: => A): A

    Used to report the duration as well as the success or failure of an operation on a deltaLog.

    Attributes
    protected
    Definition Classes
    DeltaLogging
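
Since recordDeltaOperation wraps a thunk, a caller times and reports an operation simply by running it inside the block. A sketch of that usage (the opType string and the body are hypothetical):

```scala
// Hypothetical usage: time the body and record its success or failure
// against the given deltaLog under the chosen opType.
val batch: DataFrame =
  recordDeltaOperation(deltaLog, "delta.streaming.source.getBatch") {
    getFileChangesAndCreateDataFrame(startVersion, startIndex, isStartingVersion, endOffset)
  }
```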
  46. def recordDeltaOperationForTablePath[A](tablePath: String, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: => A): A

    Used to report the duration as well as the success or failure of an operation on a tahoePath.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  47. def recordEvent(metric: MetricDefinition, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  48. def recordFrameProfile[T](group: String, name: String)(thunk: => T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  49. def recordOperation[S](opType: OpType, opTarget: String = null, extraTags: Map[TagDefinition, String], isSynchronous: Boolean = true, alwaysRecordStats: Boolean = false, allowAuthTags: Boolean = false, killJvmIfStuck: Boolean = false, outputMetric: MetricDefinition = null, silent: Boolean = true)(thunk: => S): S
    Definition Classes
    DatabricksLogging
  50. def recordProductEvent(metric: MetricDefinition with CentralizableMetric, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  51. def recordProductUsage(metric: MetricDefinition with CentralizableMetric, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  52. def recordUsage(metric: MetricDefinition, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  53. def reportLatestOffset(): Offset
    Definition Classes
    SupportsAdmissionControl
  54. val schema: StructType
    Definition Classes
    DeltaSourceBase → Source
  55. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  56. def toString(): String
    Definition Classes
    AnyRef → Any
  57. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  58. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  59. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  60. def withDmqTag[T](thunk: => T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  61. def withStatusCode[T](statusCode: String, defaultMessage: String, data: Map[String, Any] = Map.empty)(body: => T): T

    Logs a message to indicate that some command is running.

    Definition Classes
    DeltaProgressReporter
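
withStatusCode also takes its work as a by-name block, so a long-running step can surface a progress message around it. A hypothetical usage sketch:

```scala
// Hypothetical usage: report that a command is running while the
// body executes, then return the body's result.
val result = withStatusCode("DELTA", "Computing the next micro-batch") {
  // ... long-running work ...
  42
}
```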
