On supported Spark 2.4.5 clusters, the connector throws metadata-type errors in certain bulk insert scenarios (attachment: scenarios.zip):
Scenario 1: appending to or overwriting data in existing SQL tables whose columns use common types such as NVARCHAR(4000) and DATETIME2(0). Appends to NVARCHAR(MAX) and DATETIME columns succeed, presumably because Spark's StringType and TimestampType are converted to NVARCHAR(MAX) and DATETIME by default.
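For reference, a minimal PySpark sketch along the lines of the attached snippet for scenario 1. The server URL, credentials, and table/column names below are placeholders, and the `com.microsoft.sqlserver.jdbc.spark` data source name is my assumption about which connector build is in use:

```python
from datetime import datetime
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("scenario1-repro").getOrCreate()

# DataFrame matching a pre-existing table, assumed here to be dbo.TestTable
# with columns Name NVARCHAR(4000) and EventTime DATETIME2(0).
schema = StructType([
    StructField("Name", StringType(), True),          # target column: NVARCHAR(4000)
    StructField("EventTime", TimestampType(), True),  # target column: DATETIME2(0)
])
df = spark.createDataFrame([("row1", datetime(2020, 1, 1, 12, 0, 0))], schema)

# Appending fails with a metadata-type error, while a table declared with
# NVARCHAR(MAX)/DATETIME columns accepts the same write.
(df.write
    .format("com.microsoft.sqlserver.jdbc.spark")
    .mode("append")
    .option("url", "jdbc:sqlserver://<server>;databaseName=<db>")  # placeholder
    .option("dbtable", "dbo.TestTable")
    .option("user", "<user>")          # placeholder
    .option("password", "<password>")  # placeholder
    .option("schemaCheckEnabled", "false")  # the error also occurs with "true"
    .save())
```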
Scenario 2: saving while creating or overwriting whole SQL tables (via DROP and CREATE), passing the SQL table schema through the 'createTableColumnTypes' option. The package throws errors when the passed schema contains fields of common types such as NVARCHAR(MAX), DATETIME, and DATETIME2(0).
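And a sketch of scenario 2, continuing from the snippet above with the same placeholder connection details; the column-type string passed to the standard Spark JDBC 'createTableColumnTypes' option is only an example:

```python
# Continuing with df from the previous snippet: mode("overwrite") drops and
# recreates the table, with the desired SQL column types supplied via
# createTableColumnTypes.
(df.write
    .format("com.microsoft.sqlserver.jdbc.spark")
    .mode("overwrite")
    .option("url", "jdbc:sqlserver://<server>;databaseName=<db>")  # placeholder
    .option("dbtable", "dbo.TestTable")
    .option("user", "<user>")          # placeholder
    .option("password", "<password>")  # placeholder
    # Errors are thrown when this schema contains NVARCHAR(MAX), DATETIME,
    # or DATETIME2(0):
    .option("createTableColumnTypes", "Name NVARCHAR(MAX), EventTime DATETIME2(0)")
    .save())
```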
I'm sending an attachment containing snippets for both scenarios. The log4j error log points to a metadata-type problem that occurs in scenario 1 regardless of whether the 'schemaCheckEnabled' parameter is set to true or false.
I would be grateful if you could help; I believe this is an issue with the package.