Interface SupportsOverwriteV2
- All Superinterfaces:
SupportsTruncate, WriteBuilder
- All Known Subinterfaces:
SupportsOverwrite
Write builder trait for tables that support overwrite by filter.
Overwriting data by filter will delete any data that matches the filter and replace it with data that is committed in the write.
- Since:
- 3.4.0
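To illustrate the contract, here is a minimal, self-contained sketch of a write builder with overwrite-by-filter semantics. The `Predicate` and `WriteBuilder` types below are simplified stand-ins for the Spark connector interfaces (the real `Predicate` is `org.apache.spark.sql.connector.expressions.filter.Predicate`), and the in-memory table is purely illustrative.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for the V2 Predicate type: matches a row or not.
interface Predicate { boolean matches(int row); }

// Stand-in for WriteBuilder: "building" here just produces the final table contents.
interface WriteBuilder { List<Integer> build(); }

class InMemoryOverwriteBuilder implements WriteBuilder {
    private final List<Integer> existing;   // current table data
    private final List<Integer> incoming;   // data committed in the write
    private Predicate[] filters = new Predicate[0];

    InMemoryOverwriteBuilder(List<Integer> existing, List<Integer> incoming) {
        this.existing = existing;
        this.incoming = incoming;
    }

    // Configures the write to replace rows matching ALL filters (ANDed together).
    InMemoryOverwriteBuilder overwrite(Predicate[] predicates) {
        this.filters = predicates;
        return this;   // write builder returned for method chaining
    }

    @Override public List<Integer> build() {
        List<Integer> result = new ArrayList<>();
        for (int row : existing) {
            boolean allMatch = true;
            for (Predicate p : filters) {
                if (!p.matches(row)) { allMatch = false; break; }
            }
            if (!allMatch) result.add(row);   // keep rows the filters do not all match
        }
        result.addAll(incoming);              // replace the deleted rows with committed data
        return result;
    }
}
```

With existing rows `[1, 2, 3, 4]`, incoming rows `[10]`, and the single filter `row > 2`, the rows 3 and 4 are deleted and replaced, leaving `[1, 2, 10]`.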
Method Summary
Modifier and Type | Method | Description
default boolean | canOverwrite(Predicate[] predicates) | Checks whether it is possible to overwrite data from a data source table that matches filter expressions.
WriteBuilder | overwrite(Predicate[] predicates) | Configures a write to replace data matching the filters with data committed in the write.
default WriteBuilder | truncate() | Configures a write to replace all existing data with data committed in the write.
Methods inherited from interface org.apache.spark.sql.connector.write.WriteBuilder:
build, buildForBatch, buildForStreaming
Method Details
canOverwrite
default boolean canOverwrite(Predicate[] predicates)
Checks whether it is possible to overwrite data from a data source table that matches filter expressions.
Rows should be overwritten from the data source iff all of the filter expressions match. That is, the expressions must be interpreted as a set of filters that are ANDed together.
- Parameters:
- predicates - V2 filter expressions, used to match data to overwrite
- Returns:
- true if the delete operation can be performed
- Since:
- 3.4.0
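A source typically answers `canOverwrite` by checking whether it can fully enforce every filter. The sketch below assumes a hypothetical source that can only overwrite by partition-column filters; `NamedPredicate` is a simplified stand-in for the V2 `Predicate` type, and the partition column name `days` is an invented example.

```java
import java.util.Set;

class PartitionedOverwriteCheck {
    // Stand-in for the V2 Predicate: here we only care which column it references.
    record NamedPredicate(String column) {}

    // Assumed partition layout for this hypothetical table.
    private static final Set<String> PARTITION_COLS = Set.of("days");

    // Because the filters are ANDed together, overwrite is possible only if
    // every filter references a column the source can enforce exactly.
    static boolean canOverwrite(NamedPredicate[] predicates) {
        for (NamedPredicate p : predicates) {
            if (!PARTITION_COLS.contains(p.column())) return false;
        }
        return true;
    }
}
```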
overwrite
WriteBuilder overwrite(Predicate[] predicates)
Configures a write to replace data matching the filters with data committed in the write.
Rows must be deleted from the data source if and only if all of the filters match. That is, filters must be interpreted as ANDed together.
- Parameters:
- predicates - filters used to match data to overwrite
- Returns:
- this write builder for method chaining
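The "ANDed together" rule means the filter array behaves as a single conjunction. A small sketch of that composition, using `java.util.function.IntPredicate` as a stand-in for the V2 `Predicate` applied to an integer row:

```java
import java.util.function.IntPredicate;

class AndFilters {
    // Fold the filter array into one predicate, per the AND contract:
    // a row matches iff every filter in the array matches it.
    static IntPredicate allOf(IntPredicate[] filters) {
        IntPredicate combined = row -> true;   // empty array matches every row
        for (IntPredicate f : filters) {
            combined = combined.and(f);
        }
        return combined;
    }
}
```

For example, `allOf({row > 0, row < 10})` matches row 5 but not row 20, which is exactly the set of rows an overwrite with those two filters would delete.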
truncate
default WriteBuilder truncate()
Description copied from interface: SupportsTruncate
Configures a write to replace all existing data with data committed in the write.
- Specified by:
- truncate in interface SupportsTruncate
- Returns:
- this write builder for method chaining