Add DBAPI executemany INSERT batching support #2048

Open
@jlynchMicron

Description

Is your feature request related to a problem? Please describe.
It appears that the BigQuery DBAPI does not support multi-row INSERT batching for more performant Python-based DML transactions. Currently, executemany INSERT statements are executed one at a time, leading to massive slowdowns in batch INSERT DML operations.

Example multi-row INSERT from the BigQuery documentation:
https://cloud.google.com/bigquery/docs/reference/standard-sql/dml-syntax#insert_examples

INSERT dataset.Inventory (product, quantity)
VALUES('top load washer', 10),
      ('front load washer', 20),
      ('dryer', 30),
      ('refrigerator', 10),
      ('microwave', 20),
      ('dishwasher', 30),
      ('oven', 5)

Describe the solution you'd like
Add multi-row INSERT batching support.
MySQL DBAPI example: https://github.com/PyMySQL/PyMySQL/blob/main/pymysql/cursors.py#L194
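For reference, a minimal sketch of the PyMySQL-style approach, adapted as standalone Python: rewrite an `INSERT ... VALUES (...)` template plus a sequence of parameter rows into a single multi-row statement. The helper name `batch_insert_sql` and the regex are hypothetical illustrations, not part of the google-cloud-bigquery API.

```python
import re

# Hypothetical sketch (not the actual google-cloud-bigquery API): detect an
# "INSERT ... VALUES (...)" template and repeat its VALUES group once per
# parameter row, similar in spirit to PyMySQL's executemany batching.
_INSERT_VALUES = re.compile(
    r"(?P<head>INSERT\s.+?\sVALUES\s*)(?P<values>\(.+\))\s*$",
    re.IGNORECASE | re.DOTALL,
)

def batch_insert_sql(operation, seq_of_params):
    """Return (sql, flat_params) for one multi-row INSERT, or None if the
    statement is not a simple INSERT ... VALUES template."""
    match = _INSERT_VALUES.match(operation.strip())
    if match is None:
        return None  # fall back to executing rows one at a time
    head, values = match.group("head"), match.group("values")
    # Repeat the placeholder group once per row and flatten the parameters.
    sql = head + ",\n".join([values] * len(seq_of_params))
    flat = [p for row in seq_of_params for p in row]
    return sql, flat

sql, params = batch_insert_sql(
    "INSERT dataset.Inventory (product, quantity) VALUES (%s, %s)",
    [("top load washer", 10), ("front load washer", 20)],
)
# sql now contains one "(%s, %s)" placeholder group per input row
```

A real cursor implementation would also need to respect BigQuery's query-length limits by chunking large row sequences, which is omitted here for brevity.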

Describe alternatives you've considered
I will probably write a crude patch against my sqlalchemy-bigquery DBAPI cursor to enable this support in my project, which needs this performance boost for an ORM-based application.

Additional context
sqlalchemy-bigquery related ticket: googleapis/python-bigquery-sqlalchemy#497
sqlalchemy related discussion: sqlalchemy/sqlalchemy#12038

Metadata

Labels

api: bigquery (Issues related to the googleapis/python-bigquery API.)
priority: p3 (Desirable enhancement or fix. May not be included in next release.)
type: feature request ('Nice-to-have' improvement, new feature or different behavior or design.)
