
org.apache.spark.sql.delta

GeneratedColumn

object GeneratedColumn extends DeltaLogging with AnalysisHelper

Provides utility methods to implement Generated Columns for Delta. Users can use the following SQL syntax to create a table with generated columns.

CREATE TABLE table_identifier(
  column_name column_type,
  column_name column_type GENERATED ALWAYS AS ( generation_expr ),
  ...
)
USING delta
[ PARTITIONED BY (partition_column_name, ...) ]

This is an example:

CREATE TABLE foo(
  id bigint,
  type string,
  subType string GENERATED ALWAYS AS ( SUBSTRING(type FROM 0 FOR 4) ),
  data string,
  eventTime timestamp,
  day date GENERATED ALWAYS AS ( days(eventTime) )
)
USING delta
PARTITIONED BY (type, day)

When writing to a table, for these generated columns:
  - If the output is missing a generated column, we will add an expression to generate it.
  - If a generated column exists in the output (in other words, the user provides its value explicitly), we will add a constraint to ensure the given value doesn't violate the generation expression.
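
The same table can also be created programmatically. A minimal sketch, assuming the io.delta.tables.DeltaTable builder API and an active SparkSession named spark; CAST(eventTime AS DATE) stands in for days(eventTime), since the builder accepts any deterministic SQL expression string:

    import io.delta.tables.DeltaTable

    // Sketch: build the example table above with the column-builder API.
    // Generation expressions are passed as SQL strings.
    DeltaTable.create(spark)
      .tableName("foo")
      .addColumn("id", "BIGINT")
      .addColumn("type", "STRING")
      .addColumn(
        DeltaTable.columnBuilder("subType")
          .dataType("STRING")
          .generatedAlwaysAs("SUBSTRING(type FROM 0 FOR 4)")
          .build())
      .addColumn("data", "STRING")
      .addColumn("eventTime", "TIMESTAMP")
      .addColumn(
        DeltaTable.columnBuilder("day")
          .dataType("DATE")
          .generatedAlwaysAs("CAST(eventTime AS DATE)")
          .build())
      .partitionedBy("type", "day")
      .execute()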

Linear Supertypes

AnalysisHelper, DeltaLogging, DatabricksLogging, DeltaProgressReporter, Logging, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. val MIN_WRITER_VERSION: Int
  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  7. def enforcesGeneratedColumns(protocol: Protocol, metadata: Metadata): Boolean

Whether the table has generated columns. A table has generated columns only if its minWriterVersion >= GeneratedColumn.MIN_WRITER_VERSION and some of the columns in the table schema contain generation expressions.

    As Spark will propagate the column metadata storing the generation expression through the entire plan, old versions that don't support generated columns may create tables whose schemas contain generation expressions. However, since these old versions have a lower writer version, we can use the table's minWriterVersion to identify such tables and treat them as normal tables.

    protocol

    the table protocol.

    metadata

    the table metadata.
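
    Example: a minimal usage sketch, assuming an active SparkSession named spark and a Delta table at the given path:

    import org.apache.spark.sql.delta.DeltaLog

    // Sketch: check whether a table actually enforces its generated columns.
    val snapshot = DeltaLog.forTable(spark, "/data/events").snapshot
    val enforced = GeneratedColumn.enforcesGeneratedColumns(snapshot.protocol, snapshot.metadata)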

  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  11. def generatePartitionFilters(spark: SparkSession, snapshot: Snapshot, dataFilters: Seq[Expression], delta: LogicalPlan): Seq[Expression]

Try to generate partition filters from data filters if possible.

    delta

    the logical plan that outputs the same attributes as the table schema. This will be used to resolve auto generated expressions.
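
    Example: a rough sketch; the pattern match assumes the analyzed plan has the Filter node on top, and the table is the example table above:

    import org.apache.spark.sql.catalyst.TableIdentifier
    import org.apache.spark.sql.catalyst.plans.logical.Filter
    import org.apache.spark.sql.delta.DeltaLog

    // Sketch: a data filter on `eventTime` may be rewritten into an extra
    // filter on the generated partition column `day`.
    val analyzed = spark.table("foo").filter("eventTime > '2021-06-01'").queryExecution.analyzed
    val Filter(condition, delta) = analyzed  // resolved filter plus the plan providing attributes
    val snapshot = DeltaLog.forTable(spark, TableIdentifier("foo")).snapshot
    val partitionFilters = GeneratedColumn.generatePartitionFilters(spark, snapshot, Seq(condition), delta)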

  12. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  13. def getGeneratedColumns(snapshot: Snapshot): Seq[StructField]

Returns the generated columns of a table. A column is a generated column only if:
      - the table writer protocol >= GeneratedColumn.MIN_WRITER_VERSION;
      - it has a generation expression in its column metadata.
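
    Example: a minimal sketch, assuming a snapshot obtained via DeltaLog.forTable as in the sketch for enforcesGeneratedColumns:

    // Sketch: print each generated column with its stored generation expression.
    GeneratedColumn.getGeneratedColumns(snapshot).foreach { field =>
      val expr = GeneratedColumn.getGenerationExpressionStr(field.metadata)
      println(s"${field.name} GENERATED ALWAYS AS ( ${expr.getOrElse("?")} )")
    }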

  14. def getGeneratedColumnsAndColumnsUsedByGeneratedColumns(schema: StructType): Set[String]
  15. def getGenerationExpression(field: StructField): Option[Expression]

Return the generation expression from a field if any. This method doesn't check the protocol. The caller should make sure the table writer protocol meets satisfyGeneratedColumnProtocol before calling this method.
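
    Example: a sketch that guards the call with satisfyGeneratedColumnProtocol as required above, reusing a snapshot obtained as in the earlier sketches:

    if (GeneratedColumn.satisfyGeneratedColumnProtocol(snapshot.protocol)) {
      snapshot.metadata.schema.fields.foreach { field =>
        GeneratedColumn.getGenerationExpression(field).foreach { expr =>
          println(s"${field.name} is generated by: ${expr.sql}")
        }
      }
    }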

  16. def getGenerationExpressionStr(metadata: Metadata): Option[String]

    Return the generation expression from a field metadata if any.

  17. def getOptimizablePartitionExpressions(schema: StructType, partitionSchema: StructType): Map[String, Seq[OptimizablePartitionExpression]]

Try to get OptimizablePartitionExpressions of a data column when a partition column is defined as a generated column and refers to this data column.

    schema

    the table schema

    partitionSchema

    the partition schema. If a partition column is defined as a generated column, its column metadata should contain the generation expression.
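
    Example: a sketch; both schemas come straight off the snapshot metadata obtained as in the earlier sketches:

    val meta = snapshot.metadata
    // Maps a data column to the partition expressions it can optimize, e.g.
    // "eventTime" -> expressions over the generated partition column "day".
    val optimizable = GeneratedColumn.getOptimizablePartitionExpressions(meta.schema, meta.partitionSchema)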

  18. def hasGeneratedColumns(schema: StructType): Boolean

Whether any generation expressions exist in the schema. Note: this doesn't mean the table contains generated columns. A table has generated columns only if its minWriterVersion >= GeneratedColumn.MIN_WRITER_VERSION and some of the columns in the table schema contain generation expressions. Use enforcesGeneratedColumns to check for generated column tables instead.
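
    Example: a sketch showing why the two checks can disagree for tables written by old writer versions, reusing a snapshot obtained as in the earlier sketches:

    val hasExprs = GeneratedColumn.hasGeneratedColumns(snapshot.metadata.schema)
    val enforced = GeneratedColumn.enforcesGeneratedColumns(snapshot.protocol, snapshot.metadata)
    // hasExprs may be true while enforced is false for a legacy table whose
    // schema carries generation expressions but whose writer version is too low.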

  19. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  20. def improveUnsupportedOpError(f: => Unit): Unit
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  21. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  22. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  23. def isGeneratedColumn(protocol: Protocol, field: StructField): Boolean

    Whether a column is a generated column.

  24. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  25. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  26. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  27. def logConsole(line: String): Unit
    Definition Classes
    DatabricksLogging
  28. def logDebug(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  29. def logDebug(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  30. def logError(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  31. def logError(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  32. def logInfo(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  33. def logInfo(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  35. def logTrace(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  36. def logTrace(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  37. def logWarning(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  38. def logWarning(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  39. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  40. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  41. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  42. def partitionFilterOptimizationEnabled(spark: SparkSession): Boolean
  43. def recordDeltaEvent(deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty, data: AnyRef = null, path: Option[Path] = None): Unit

Used to record the occurrence of a single event or report detailed, operation specific statistics.

    path

    Used to log the path of the delta table when deltaLog is null.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  44. def recordDeltaOperation[A](deltaLog: DeltaLog, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: => A): A

Used to report the duration as well as the success or failure of an operation on a deltaLog.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  45. def recordDeltaOperationForTablePath[A](tablePath: String, opType: String, tags: Map[TagDefinition, String] = Map.empty)(thunk: => A): A

Used to report the duration as well as the success or failure of an operation on a tahoePath.

    Attributes
    protected
    Definition Classes
    DeltaLogging
  46. def recordEvent(metric: MetricDefinition, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  47. def recordFrameProfile[T](group: String, name: String)(thunk: => T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  48. def recordOperation[S](opType: OpType, opTarget: String = null, extraTags: Map[TagDefinition, String], isSynchronous: Boolean = true, alwaysRecordStats: Boolean = false, allowAuthTags: Boolean = false, killJvmIfStuck: Boolean = false, outputMetric: MetricDefinition = null, silent: Boolean = true)(thunk: => S): S
    Definition Classes
    DatabricksLogging
  49. def recordProductEvent(metric: MetricDefinition with CentralizableMetric, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, trimBlob: Boolean = true): Unit
    Definition Classes
    DatabricksLogging
  50. def recordProductUsage(metric: MetricDefinition with CentralizableMetric, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  51. def recordUsage(metric: MetricDefinition, quantity: Double, additionalTags: Map[TagDefinition, String] = Map.empty, blob: String = null, forceSample: Boolean = false, trimBlob: Boolean = true, silent: Boolean = false): Unit
    Definition Classes
    DatabricksLogging
  52. def resolveReferencesForExpressions(sparkSession: SparkSession, exprs: Seq[Expression], planProvidingAttrs: LogicalPlan): Seq[Expression]

Resolve expressions using the attributes provided by planProvidingAttrs. Throw an error if failing to resolve any expressions.

    Attributes
    protected
    Definition Classes
    AnalysisHelper
  53. def satisfyGeneratedColumnProtocol(protocol: Protocol): Boolean
  54. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  55. def toDataset(sparkSession: SparkSession, logicalPlan: LogicalPlan): Dataset[Row]
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  56. def toString(): String
    Definition Classes
    AnyRef → Any
  57. def tryResolveReferences(sparkSession: SparkSession)(expr: Expression, planContainingExpr: LogicalPlan): Expression
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  58. def tryResolveReferencesForExpressions(sparkSession: SparkSession, exprs: Seq[Expression], planContainingExpr: LogicalPlan): Seq[Expression]
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  59. def validateGeneratedColumns(spark: SparkSession, schema: StructType): Unit

If the schema contains generated columns, check for the following unsupported cases:
      - the generation expression refers to a non-existent column or to another generated column;
      - it uses an unsupported expression;
      - the expression type is not the same as the column type.
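
    Example: a sketch; the metadata key delta.generationExpression is an assumption about how the generation expression is stored, so prefer the public column-builder API in real code:

    import org.apache.spark.sql.types.{MetadataBuilder, StringType, StructField, StructType}

    // Sketch: validation should fail because the expression refers to the
    // non-existent column `missing`.
    val exprMeta = new MetadataBuilder()
      .putString("delta.generationExpression", "SUBSTRING(missing FROM 0 FOR 4)")  // assumed storage key
      .build()
    val schema = new StructType()
      .add("id", StringType)
      .add(StructField("bad", StringType, nullable = true, metadata = exprMeta))
    GeneratedColumn.validateGeneratedColumns(spark, schema)  // expected to throw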

  60. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  61. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  62. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  63. def withDmqTag[T](thunk: => T): T
    Attributes
    protected
    Definition Classes
    DeltaLogging
  64. def withStatusCode[T](statusCode: String, defaultMessage: String, data: Map[String, Any] = Map.empty)(body: => T): T

Report a log to indicate some command is running.

    Definition Classes
    DeltaProgressReporter
