object VacuumCommand extends VacuumCommandImpl with Serializable

Vacuums the table by clearing all untracked files and folders within it. It first lists all files and directories in the table and computes their paths relative to the table base. It then gets the list of all tracked files for this table, which may or may not be within the table base path, and computes their relative paths as well; tracked files outside of the table path are ignored. Finally, it takes a diff of the two lists and deletes directories that were already empty, along with all files within the table that are no longer tracked.
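The listing-and-diff procedure described above can be sketched in plain Scala. This is a simplified stand-in (the real implementation operates on Spark Datasets and Hadoop paths); `VacuumDiffSketch`, `relativize`, and `untrackedFiles` are hypothetical names introduced here for illustration:

```scala
object VacuumDiffSketch {
  // Relativize an absolute path against the table base; None if the path
  // lies outside the table (such tracked files are ignored by vacuum).
  def relativize(base: String, path: String): Option[String] = {
    val b = if (base.endsWith("/")) base else base + "/"
    if (path.startsWith(b)) Some(path.stripPrefix(b)) else None
  }

  // Files present under the table base that are not tracked by the Delta log.
  def untrackedFiles(base: String,
                     allFiles: Seq[String],
                     trackedFiles: Seq[String]): Set[String] = {
    val all     = allFiles.flatMap(relativize(base, _)).toSet
    val tracked = trackedFiles.flatMap(relativize(base, _)).toSet
    all -- tracked  // candidates for deletion
  }
}
```

In the real command this diff is further filtered by the retention period before anything is deleted.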

Inherited
  1. VacuumCommand
  2. Serializable
  3. VacuumCommandImpl
  4. DeltaCommand
  5. DeltaLogging
  6. DatabricksLogging
  7. DeltaProgressReporter
  8. Logging
  9. AnyRef
  10. Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def allValidFiles(file: String, isBloomFiltered: Boolean): Seq[String]

This is used to create the list of files we want to retain during GC.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def buildBaseRelation(spark: SparkSession, txn: OptimisticTransaction, actionType: String, rootPath: Path, inputLeafFiles: Seq[String], nameToAddFileMap: Map[String, AddFile]): HadoopFsRelation

Build a base relation of files that need to be rewritten as part of an update/delete/merge operation.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  7. def checkRetentionPeriodSafety(spark: SparkSession, retentionMs: Option[Long], configuredRetention: Long): Unit

Additional check on retention duration to prevent people from shooting themselves in the foot.

    Attributes
    protected
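The intent of the retention safety check can be sketched as follows: a user-specified retention shorter than the table's configured retention is rejected unless the safety check has been explicitly disabled. `RetentionSafetySketch` and the error message are hypothetical; the real check reads the configured retention and a safety-check flag from the Spark session:

```scala
object RetentionSafetySketch {
  // Reject a retention override shorter than the configured retention,
  // unless the safety check has been explicitly disabled.
  def check(retentionMs: Option[Long],
            configuredRetentionMs: Long,
            safetyCheckEnabled: Boolean): Unit = {
    retentionMs.foreach { ms =>
      require(ms >= 0, "retention must be non-negative")
      if (safetyCheckEnabled && ms < configuredRetentionMs) {
        throw new IllegalArgumentException(
          s"retention of $ms ms is below the configured $configuredRetentionMs ms")
      }
    }
  }
}
```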
  8. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  9. def commitLarge(spark: SparkSession, txn: OptimisticTransaction, actions: Iterator[Action], op: Operation, context: Map[String, String], metrics: Map[String, String]): Long

Create a large commit on the Delta log by directly writing an iterator of FileActions to the LogStore. This function only commits the next possible version and will not check whether the commit is retry-able. If the next version has already been committed, then this function will fail. This bypasses all optimistic concurrency checks. We assume that transaction conflicts should be rare because this method is typically used to create new tables (e.g. CONVERT TO DELTA) or apply some commands which rarely receive other transactions (e.g. CLONE/RESTORE).

    Attributes
    protected
    Definition Classes
    DeltaCommand
  10. def delete(diff: Dataset[String], spark: SparkSession, basePath: String, hadoopConf: Broadcast[SerializableConfiguration], parallel: Boolean, parallelPartitions: Int): Long

Attempts to delete the list of candidate files. Returns the number of files deleted.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
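The delete-and-count behavior can be sketched with `java.nio.file` in place of the Hadoop FileSystem API; `DeleteSketch` is a hypothetical name, and the real method additionally supports parallel deletion across partitions:

```scala
import java.nio.file.{Files, Paths}

object DeleteSketch {
  // Attempt to delete each candidate relative to basePath; return how many
  // files were actually deleted (already-missing files are skipped, not errors).
  def delete(basePath: String, relativePaths: Seq[String]): Long =
    relativePaths.count(rel => Files.deleteIfExists(Paths.get(basePath, rel))).toLong
}
```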
  11. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  12. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  13. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  14. def gc(spark: SparkSession, deltaLog: DeltaLog, dryRun: Boolean = true, retentionHours: Option[Double] = None, clock: Clock = new SystemClock): DataFrame

Clears all untracked files and folders within this table. First lists all the files and directories in the table, and gets the relative paths with respect to the base of the table. Then it gets the list of all tracked files for this table, which may or may not be within the table base path, and gets the relative paths of all the tracked files with respect to the base of the table. Files outside of the table path will be ignored. Then we take a diff of the files and delete directories that were already empty, and all files that are within the table that are no longer tracked.

    dryRun

    If set to true, no files will be deleted. Instead, we will list all files and directories that will be cleared.

    retentionHours

    An optional parameter to override the default Delta tombstone retention period

    returns

    A Dataset containing the paths of the files/folders to delete in dryRun mode. Otherwise returns the base path of the table.

  15. def generateCandidateFileMap(basePath: Path, candidateFiles: Seq[AddFile]): Map[String, AddFile]

Generates a map of file names to add file entries for operations where we will need to rewrite files such as delete, merge, update. We expect file names to be unique, because each file contains a UUID.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  16. def getAllSubdirs(base: String, file: String, fs: FileSystem): Iterator[String]

Wrapper around DeltaFileOperations.getAllSubDirectories; returns all subdirectories of file relative to base.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
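The subdirectory enumeration can be sketched in plain Scala (hypothetical `SubdirsSketch`; the real wrapper delegates to DeltaFileOperations and takes a Hadoop FileSystem):

```scala
object SubdirsSketch {
  // All parent directories of `file`, expressed relative to `base`.
  // e.g. base = "/t", file = "/t/a/b/c.parquet" yields "a" then "a/b".
  def getAllSubdirs(base: String, file: String): Iterator[String] = {
    val b     = if (base.endsWith("/")) base else base + "/"
    val parts = file.stripPrefix(b).split('/').dropRight(1) // drop the file name
    (1 to parts.length).iterator.map(i => parts.take(i).mkString("/"))
  }
}
```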
  17. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  18. def getDeltaLog(spark: SparkSession, path: Option[String], tableIdentifier: Option[TableIdentifier], operationName: String): DeltaLog

Utility method to return the DeltaLog of an existing Delta table, referred to by either the given path or the given tableIdentifier.

    spark

    SparkSession reference to use.

    path

    Table location. Expects a non-empty tableIdentifier or path.

    tableIdentifier

    Table identifier. Expects a non-empty tableIdentifier or path.

    operationName

    Operation that is getting the DeltaLog, used in error messages.

    returns

    DeltaLog of the table

    Attributes
    protected
    Definition Classes
    DeltaCommand
    Exceptions thrown

AnalysisException If no Delta table exists at the given path/identifier, or if neither path nor tableIdentifier is provided.

  19. def getTouchedFile(basePath: Path, filePath: String, nameToAddFileMap: Map[String, AddFile]): AddFile

Find the AddFile record corresponding to the file that was read as part of a delete/update/merge operation.

    filePath

    The path to a file. Can be either absolute or relative

    nameToAddFileMap

    Map generated through generateCandidateFileMap()

    Attributes
    protected
    Definition Classes
    DeltaCommand
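Taken together, generateCandidateFileMap and getTouchedFile implement a build-then-lookup pattern that can be sketched with plain strings standing in for AddFile entries (`CandidateFileMapSketch` is a hypothetical name; uniqueness of file names is the documented assumption, since each name contains a UUID):

```scala
object CandidateFileMapSketch {
  // File-name -> entry map; file names are assumed unique (each contains a UUID).
  def generateCandidateFileMap(candidates: Seq[String]): Map[String, String] = {
    val byName = candidates.map(p => p.split('/').last -> p).toMap
    require(byName.size == candidates.size, "file names must be unique")
    byName
  }

  // Look up the entry for a touched file, whether given absolute or relative:
  // only the file name matters because names are unique.
  def getTouchedFile(filePath: String, nameToFile: Map[String, String]): String =
    nameToFile.getOrElse(filePath.split('/').last,
      throw new NoSuchElementException(s"$filePath not found among candidates"))
}
```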
  20. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  21. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  22. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  23. def isCatalogTable(analyzer: Analyzer, tableIdent: TableIdentifier): Boolean

Use the analyzer to see whether the provided TableIdentifier refers to a path-based table or not.

    analyzer

    The session state analyzer to call

    tableIdent

Table identifier to check for being path-based or not

    returns

Boolean that is true when the table is defined in a metastore and false when the table is path-based

    Definition Classes
    DeltaCommand
  24. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  25. def isPathIdentifier(tableIdent: TableIdentifier): Boolean

Checks whether the given identifier could refer to a Delta table's path.

    tableIdent

Table identifier to check

    Attributes
    protected
    Definition Classes
    DeltaCommand
  26. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  27. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  28. def logConsole(line: String): Unit
    Definition Classes
    DatabricksLogging
  29. def logDebug(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  30. def logDebug(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  31. def logError(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  32. def logError(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  33. def logInfo(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logInfo(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  35. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  36. def logTrace(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  37. def logTrace(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  38. def logVacuumEnd(deltaLog: DeltaLog, spark: SparkSession, path: Path, filesDeleted: Option[Long] = None, dirCounts: Option[Long] = None): Unit
    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  39. def logVacuumStart(spark: SparkSession, deltaLog: DeltaLog, path: Path, diff: Dataset[String], specifiedRetentionMillis: Option[Long], defaultRetentionMillis: Long): Unit
    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  40. def logWarning(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  41. def logWarning(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  42. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  43. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  44. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  45. def parsePredicates(spark: SparkSession, predicate: String): Seq[Expression]

Converts string predicates into Expressions relative to a transaction.

    Attributes
    protected
    Definition Classes
    DeltaCommand
    Exceptions thrown

    AnalysisException if a non-partition column is referenced.

  46. def pathToString(path: Path): String
    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  47. def recordDeltaEvent(deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty, data: AnyRef = null, path: Option[Path] = None): Unit

Used to record the occurrence of a single event or to report detailed, operation-specific statistics.

    path

    Used to log the path of the delta table when deltaLog is null.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  48. def recordDeltaOperation[A](deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: => A): A

Used to report the duration as well as the success or failure of an operation on a deltaLog.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  49. def recordDeltaOperationForTablePath[A](tablePath: String, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: => A): A

Used to report the duration as well as the success or failure of an operation on a tahoePath.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  50. def recordEvent(metric: MetricDefinition, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  51. def recordFrameProfile[T](group: String, name: String)(thunk: => T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  52. def recordOperation[S](opType: OpType, opTarget: String = null, extraTags: Map[TagDefinition, String], isSynchronous: Boolean = true, alwaysRecordStats: Boolean = false, allowAuthTags: Boolean = false, killJvmIfStuck: Boolean = false, outputMetric: MetricDefinition = null, silent: Boolean = true)(thunk: => S): S
    Definition Classes
    DatabricksLogging
  53. def recordProductEvent(metric: MetricDefinition with CentralizableMetric, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  54. def recordProductUsage(metric: MetricDefinition with CentralizableMetric, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  55. def recordUsage(metric: MetricDefinition, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  56. def relativize(path: Path, fs: FileSystem, reservoirBase: Path, isDir: Boolean): String

Attempts to relativize the path with respect to the reservoirBase and converts the path to a string.

    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  57. def removeFilesFromPaths(deltaLog: DeltaLog, nameToAddFileMap: Map[String, AddFile], filesToRewrite: Seq[String], operationTimestamp: Long): Seq[RemoveFile]

This method provides the RemoveFile actions that are necessary for files that are touched and need to be rewritten in methods like Delete, Update, and Merge.

    deltaLog

    The DeltaLog of the table that is being operated on

    nameToAddFileMap

    A map generated using generateCandidateFileMap.

    filesToRewrite

    Absolute paths of the files that were touched. We will search for these in candidateFiles. Obtained as the output of the input_file_name function.

    operationTimestamp

    The timestamp of the operation

    Attributes
    protected
    Definition Classes
    DeltaCommand
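The lookup-and-timestamp behavior can be sketched with tuples standing in for RemoveFile actions (`RemoveFilesSketch` is a hypothetical name; the real method resolves entries through the DeltaLog and candidate map generated by generateCandidateFileMap):

```scala
object RemoveFilesSketch {
  // For each touched file (absolute or relative path), find its entry in the
  // candidate map by file name and pair it with the operation timestamp,
  // standing in for a RemoveFile action.
  def removeFilesFromPaths(nameToFile: Map[String, String],
                           filesToRewrite: Seq[String],
                           operationTimestamp: Long): Seq[(String, Long)] =
    filesToRewrite.map(p => nameToFile(p.split('/').last) -> operationTimestamp)
}
```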
  58. def resolveIdentifier(analyzer: Analyzer, identifier: TableIdentifier): LogicalPlan

Use the analyzer to resolve the provided identifier.

    analyzer

    The session state analyzer to call

    identifier

Table identifier to resolve

    Attributes
    protected
    Definition Classes
    DeltaCommand
  59. def stringToPath(path: String): Path
    Attributes
    protected
    Definition Classes
    VacuumCommandImpl
  60. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  61. def toString(): String
    Definition Classes
    AnyRef → Any
  62. def updateAndCheckpoint(spark: SparkSession, deltaLog: DeltaLog, commitSize: Int, attemptVersion: Long, txnId: String): Snapshot

Update the table now that the commit has been made, and write a checkpoint.

    Attributes
    protected
    Definition Classes
    DeltaCommand
  63. def verifyPartitionPredicates(spark: SparkSession, partitionColumns: Seq[String], predicates: Seq[Expression]): Unit
    Definition Classes
    DeltaCommand
  64. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  65. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  66. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  67. def withDmqTag[T](thunk: => T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  68. def withStatusCode[T](statusCode: String, defaultMessage: String, data: Map[String, Any] = Map.empty)(body: => T): T

Report a log message to indicate that a command is running.

    Definition Classes
    DeltaProgressReporter
