class CustomDfCreatorWrapper extends CustomDfCreator
Linear Supertypes
CustomDfCreator, Serializable, Serializable, AnyRef, Any
Instance Constructors
- new CustomDfCreatorWrapper(fnExec: fnExecType, fnSchema: fnSchemaType)
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native() @HotSpotIntrinsicCandidate()
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def exec(session: SparkSession, config: Map[String, String]): DataFrame
  This method creates a DataFrame based on custom code.
  - session: the Spark session
  - config: the input config of the associated action
  - returns: the custom DataFrame
  - Definition Classes: CustomDfCreatorWrapper → CustomDfCreator
- val fnExec: fnExecType
- val fnSchema: fnSchemaType
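The wrapper's role is simply to hold the two functions given to the constructor and forward `exec` and `schema` calls to them. The sketch below illustrates that delegation pattern only; it is not the real SDLB code, and it substitutes plain stand-in types for the `fnExecType`/`fnSchemaType` aliases (which in reality take a SparkSession and produce a DataFrame / Option[StructType]) so the example carries no Spark dependency.

```scala
// Stand-in types: simplified placeholders, NOT the real SDLB aliases.
type Config   = Map[String, String]
type FnExec   = Config => String           // stand-in for (SparkSession, Map[String, String]) => DataFrame
type FnSchema = Config => Option[String]   // stand-in for (SparkSession, Map[String, String]) => Option[StructType]

// Like CustomDfCreatorWrapper, this class only stores the two functions
// and forwards exec/schema calls to them.
class WrapperSketch(val fnExec: FnExec, val fnSchema: FnSchema) {
  def exec(config: Config): String           = fnExec(config)
  def schema(config: Config): Option[String] = fnSchema(config)
}

val w = new WrapperSketch(
  cfg => s"df(${cfg.getOrElse("rows", "0")})",  // pretend DataFrame creation from config
  _   => None                                   // no custom schema provided
)
println(w.exec(Map("rows" -> "42")))  // df(42)
```

Because the wrapper adds no logic of its own, any pair of functions with the right shapes can be plugged in without subclassing CustomDfCreator directly.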
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def schema(session: SparkSession, config: Map[String, String]): Option[StructType]
  Adds the possibility to return a custom schema during init. If no schema is returned, init needs to call exec to get the schema, which causes exec to be called twice.
  - session: the Spark session
  - config: the input config of the associated action
  - returns: the schema of the custom DataFrame
  - Definition Classes: CustomDfCreatorWrapper → CustomDfCreator
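The double-exec behaviour described for schema can be made concrete with a small sketch. This is a hypothetical illustration of the init logic, not the real SDLB code, and it again uses Spark-free stand-in types: when the creator supplies no schema, init has to run exec itself just to derive one, so exec ends up being invoked twice overall.

```scala
// Hypothetical sketch of the init fallback, NOT the real SDLB init code.
type Config = Map[String, String]

var execCalls = 0
// Stand-in for the DataFrame-producing exec; counts how often it runs.
def exec(config: Config): List[Int] = { execCalls += 1; List(1, 2, 3) }
// Creator supplies no custom schema, mirroring a schema method that returns None.
def schemaOf(config: Config): Option[String] = None

// If no schema is returned, fall back to a full exec run to infer it.
def init(config: Config): String =
  schemaOf(config).getOrElse(s"inferred schema of ${exec(config).length} columns")

init(Map.empty)     // first exec call, only to obtain the schema
exec(Map.empty)     // second exec call, for the actual data
println(execCalls)  // 2
```

Returning `Some(schema)` from the schema method short-circuits the fallback, which is why implementing it is worthwhile whenever exec is expensive.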
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
Deprecated Value Members
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] ) @Deprecated
  - Deprecated