This repository was archived by the owner on Feb 27, 2025. It is now read-only.

Writing into Azure SQL DB doesn't work for popular field types #83

@radekhag

Description


Hello,

On supported Spark 2.4.5 clusters, the connector throws metadata-type-related errors in certain scenarios during bulk insert tasks:
scenarios.zip

  1. On appending and overwriting data in existing SQL tables with popular field types like NVARCHAR(4000) and DATETIME2(0). Appends to tables with NVARCHAR(MAX) and DATETIME fields seem to succeed, probably because StringType and TimestampType are converted to NVARCHAR(MAX) and DATETIME by default (see the first sketch after this list).


  2. On saving combined with creating or overwriting whole SQL tables (by DROP and CREATE), passing the schema of the SQL table through the option ‘createTableColumnTypes’. The package throws errors when the passed schema contains fields of popular types like NVARCHAR(MAX), DATETIME, or DATETIME2(0) (see the second sketch after this list).

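A minimal PySpark sketch of what scenario 1 seems to describe, for reference. It assumes the connector is invoked through the com.microsoft.sqlserver.jdbc.spark format; the server, table, and column names are placeholders and are not taken from the attached snippets:

```python
from datetime import datetime

from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("scenario1-append").getOrCreate()

# DataFrame whose Spark types map onto the SQL column types mentioned above.
schema = StructType([
    StructField("name", StringType(), True),           # target column: NVARCHAR(4000)
    StructField("created_at", TimestampType(), True),  # target column: DATETIME2(0)
])
df = spark.createDataFrame([("example", datetime(2020, 1, 1, 12, 0, 0))], schema)

# Append into an existing table (the same error is reported for overwrite).
(df.write
    .format("com.microsoft.sqlserver.jdbc.spark")
    .mode("append")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<db>")
    .option("dbtable", "dbo.ExistingTable")
    .option("user", "<user>")
    .option("password", "<password>")
    .option("schemaCheckEnabled", "false")  # reportedly fails with both false and true
    .save())
```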
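A corresponding sketch of scenario 2, overwriting (DROP and CREATE) the target table while passing the SQL column types through ‘createTableColumnTypes’; `df` is the DataFrame from the sketch above, and the connection details are again placeholders:

```python
# Overwrite the table and ask the connector to create columns with the listed SQL types.
(df.write
    .format("com.microsoft.sqlserver.jdbc.spark")
    .mode("overwrite")
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<db>")
    .option("dbtable", "dbo.NewTable")
    .option("user", "<user>")
    .option("password", "<password>")
    .option("createTableColumnTypes", "name NVARCHAR(MAX), created_at DATETIME2(0)")
    .save())
```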

I’m attaching snippets for both scenarios. The log4j error log points to a metadata-type problem that occurs regardless of whether the parameter ‘schemaCheckEnabled’ is set to false or true, and refers to scenario 1.
I would be grateful if you could help me. I think this is an issue with the package.

Regards,
Radoslaw

Metadata

Assignees: No one assigned
Labels: duplicate (This issue or pull request already exists), in progress (This issue is being looked at and in progress)
