Packages

com.nvidia.spark.rapids

SparkShims

trait SparkShims extends AnyRef

Abstract Value Members

  1. abstract def ansiCastRule: ExprRule[_ <: Expression]

    Return the replacement rule for AnsiCast. AnsiCast was removed in Spark 3.4.0, so it needs to be handled separately.

  2. abstract def aqeShuffleReaderExec: ExecRule[_ <: SparkPlan]
  3. abstract def attachTreeIfSupported[TreeType <: TreeNode[_], A](tree: TreeType, msg: String = "")(f: ⇒ A): A

    dropped by SPARK-34234

  4. abstract def avroRebaseReadKey: String
  5. abstract def avroRebaseWriteKey: String
  6. abstract def broadcastModeTransform(mode: BroadcastMode, toArray: Array[InternalRow]): Any
  7. abstract def columnarAdaptivePlan(a: AdaptiveSparkPlanExec, goal: CoalesceSizeGoal): SparkPlan
  8. abstract def filesFromFileIndex(fileCatalog: PartitioningAwareFileIndex): Seq[FileStatus]
  9. abstract def findOperators(plan: SparkPlan, predicate: (SparkPlan) ⇒ Boolean): Seq[SparkPlan]

    Walk the plan recursively and return a list of operators that match the predicate.

  10. abstract def getAdaptiveInputPlan(adaptivePlan: AdaptiveSparkPlanExec): SparkPlan
  11. abstract def getDataWriteCmds: Map[Class[_ <: DataWritingCommand], DataWritingCommandRule[_ <: DataWritingCommand]]
  12. abstract def getDateFormatter(): DateFormatter
  13. abstract def getExecs: Map[Class[_ <: SparkPlan], ExecRule[_ <: SparkPlan]]
  14. abstract def getExprs: Map[Class[_ <: Expression], ExprRule[_ <: Expression]]
  15. abstract def getFileScanRDD(sparkSession: SparkSession, readFunction: (PartitionedFile) ⇒ Iterator[InternalRow], filePartitions: Seq[FilePartition], readDataSchema: StructType, metadataColumns: Seq[AttributeReference] = Seq.empty, fileFormat: Option[FileFormat] = None): RDD[InternalRow]
  16. abstract def getParquetFilters(schema: MessageType, pushDownDate: Boolean, pushDownTimestamp: Boolean, pushDownDecimal: Boolean, pushDownStartWith: Boolean, pushDownInFilterThreshold: Int, caseSensitive: Boolean, lookupFileMeta: (String) ⇒ String, dateTimeRebaseModeFromConf: String): ParquetFilters
  17. abstract def getRunnableCmds: Map[Class[_ <: RunnableCommand], RunnableCommandRule[_ <: RunnableCommand]]
  18. abstract def getScans: Map[Class[_ <: Scan], ScanRule[_ <: Scan]]
  19. abstract def hasAliasQuoteFix: Boolean
  20. abstract def hasCastFloatTimestampUpcast: Boolean
  21. abstract def int96ParquetRebaseRead(conf: SQLConf): String
  22. abstract def int96ParquetRebaseReadKey: String
  23. abstract def int96ParquetRebaseWrite(conf: SQLConf): String
  24. abstract def int96ParquetRebaseWriteKey: String
  25. abstract def isAqePlan(p: SparkPlan): Boolean
  26. abstract def isCustomReaderExec(x: SparkPlan): Boolean
  27. abstract def isEmptyRelation(relation: Any): Boolean
  28. abstract def isExchangeOp(plan: SparkPlanMeta[_]): Boolean
  29. abstract def isWindowFunctionExec(plan: SparkPlan): Boolean
  30. abstract def leafNodeDefaultParallelism(ss: SparkSession): Int
  31. abstract def neverReplaceShowCurrentNamespaceCommand: ExecRule[_ <: SparkPlan]
  32. abstract def newBroadcastQueryStageExec(old: BroadcastQueryStageExec, newPlan: SparkPlan): BroadcastQueryStageExec
  33. abstract def parquetRebaseRead(conf: SQLConf): String
  34. abstract def parquetRebaseReadKey: String
  35. abstract def parquetRebaseWrite(conf: SQLConf): String
  36. abstract def parquetRebaseWriteKey: String
  37. abstract def reproduceEmptyStringBug: Boolean

    Handle regexp_replace inconsistency from https://issues.apache.org/jira/browse/SPARK-39107

  38. abstract def reusedExchangeExecPfn: PartialFunction[SparkPlan, ReusedExchangeExec]
  39. abstract def sessionFromPlan(plan: SparkPlan): SparkSession
  40. abstract def shouldFailDivOverflow: Boolean
  41. abstract def skipAssertIsOnTheGpu(plan: SparkPlan): Boolean

    Our tests, by default, will check that all operators are running on the GPU, but there are some operators that we do not translate to GPU plans, so we need a way to bypass the check for those.

  42. abstract def supportsColumnarAdaptivePlans: Boolean

    Determine if the Spark version allows the supportsColumnar flag to be overridden in AdaptiveSparkPlanExec. This feature was introduced in Spark 3.2 as part of SPARK-35881.

  43. abstract def tryTransformIfEmptyRelation(mode: BroadcastMode): Option[Any]

    This call can produce an EmptyHashedRelation or an empty array, allowing the AQE rule EliminateJoinToEmptyRelation in Spark 3.1.x to optimize certain joins.

    In Spark 3.2.0, the optimization is still performed (under AQEPropagateEmptyRelation), but the AQE optimizer is looking at the metrics for the query stage to determine if numRows == 0, and if so it can eliminate certain joins.

    The call is implemented only for Spark 3.1.x+. It is disabled in Databricks because it requires a task context to perform the BroadcastMode.transform call, but we'd like to call this from the driver.

  44. abstract def v1RepairTableCommand(tableName: TableIdentifier): RunnableCommand
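The plan-traversal hooks above, such as findOperators (item 9), follow a straightforward recursive pattern. The following is a minimal, self-contained sketch of that pattern; `MiniPlan` is a hypothetical stand-in for Spark's `SparkPlan`, used only so the example runs without Spark on the classpath:

```scala
// Hypothetical stand-in for Spark's SparkPlan tree node, for illustration only.
case class MiniPlan(name: String, children: Seq[MiniPlan] = Seq.empty)

// Sketch of findOperators: walk the plan depth-first and collect every
// operator for which the predicate returns true.
def findOperators(plan: MiniPlan, predicate: MiniPlan => Boolean): Seq[MiniPlan] = {
  val matched = if (predicate(plan)) Seq(plan) else Seq.empty[MiniPlan]
  matched ++ plan.children.flatMap(c => findOperators(c, predicate))
}

val plan = MiniPlan("ProjectExec", Seq(
  MiniPlan("FilterExec", Seq(MiniPlan("FileSourceScanExec"))),
  MiniPlan("FileSourceScanExec")))

// Collect both scan operators, wherever they sit in the tree.
val scans = findOperators(plan, _.name == "FileSourceScanExec")
```

A real shim implements this against `SparkPlan`, but the shape of the recursion is the same: test the current node, then recurse into its children.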

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def addExecBroadcastShuffle(p: SparkPlan): SparkPlan

    If the shim doesn't support executor broadcast, just return the plan passed in.

  5. def addRowShuffleToQueryStageTransitionIfNeeded(c2r: ColumnarToRowTransition, sqse: ShuffleQueryStageExec): SparkPlan

    Adds a row-based shuffle to the transitional shuffle query stage if needed. This is required when AQE plans a GPU shuffle exchange to be reused by a parent plan exec that consumes rows.

  6. def applyPostShimPlanRules(plan: SparkPlan): SparkPlan
  7. def applyShimPlanRules(plan: SparkPlan, conf: RapidsConf): SparkPlan
  8. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  9. def checkCToRWithExecBroadcastAQECoalPart(p: SparkPlan, parent: Option[SparkPlan]): Boolean
  10. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  11. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  12. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  13. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  14. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  15. def getShuffleFromCToRWithExecBroadcastAQECoalPart(p: SparkPlan): Option[SparkPlan]
  16. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  17. def isCastingStringToNegDecimalScaleSupported: Boolean
  18. def isExecutorBroadcastShuffle(shuffle: ShuffleExchangeLike): Boolean
  19. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  20. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  21. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  22. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  23. def shuffleParentReadsShuffleData(shuffle: ShuffleExchangeLike, parent: SparkPlan): Boolean
  24. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  25. def toString(): String
    Definition Classes
    AnyRef → Any
  26. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
