Description
Is your feature request related to a problem? Please describe.
It appears that the BigQuery DBAPI does not support multi-row INSERT batching for more performant Python-based DML transactions. Currently, `executemany` executes its INSERT statements one at a time, which leads to massive slowdowns in batch INSERT DML operations.
Example multi-row insert from the BigQuery documentation:
https://cloud.google.com/bigquery/docs/reference/standard-sql/dml-syntax#insert_examples
```sql
INSERT dataset.Inventory (product, quantity)
VALUES('top load washer', 10),
      ('front load washer', 20),
      ('dryer', 30),
      ('refrigerator', 10),
      ('microwave', 20),
      ('dishwasher', 30),
      ('oven', 5)
```
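For reference, here is a minimal sketch of how this surfaces through the DBAPI today, assuming a `dataset.Inventory` table like the one above. Each parameter tuple passed to `executemany` is executed as its own query job:

```python
from google.cloud import bigquery
from google.cloud.bigquery import dbapi

client = bigquery.Client()
connection = dbapi.connect(client)
cursor = connection.cursor()

rows = [
    ("top load washer", 10),
    ("front load washer", 20),
    ("dryer", 30),
]

# Today each tuple is sent as a separate INSERT job, so this issues
# len(rows) round trips instead of one multi-row statement.
cursor.executemany(
    "INSERT dataset.Inventory (product, quantity) VALUES (%s, %s)",
    rows,
)
```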
Describe the solution you'd like
Add multi-row INSERT batching support.
MySQL DBAPI example: https://github.com/PyMySQL/PyMySQL/blob/main/pymysql/cursors.py#L194
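For concreteness, here is a rough sketch of the PyMySQL-style rewrite, not a proposed implementation; the regex and the `batch_insert_sql` helper name are made up for this example. The idea is to detect the `VALUES` clause of a simple single-row INSERT and repeat it once per parameter tuple, so the whole batch goes out as one statement:

```python
import re

# Illustrative pattern: matches a simple single-row
# "INSERT ... VALUES (...)" statement and captures the VALUES group.
_INSERT_VALUES = re.compile(
    r"\A(?P<head>\s*INSERT\b.+?\bVALUES\s*)(?P<values>\(.+\))\s*;?\s*\Z",
    re.IGNORECASE | re.DOTALL,
)


def batch_insert_sql(operation, seq_of_parameters):
    """Rewrite a single-row INSERT into one multi-row INSERT (hypothetical helper)."""
    if not seq_of_parameters:
        raise ValueError("no parameter rows supplied")
    match = _INSERT_VALUES.match(operation)
    if match is None:
        raise ValueError("not a simple INSERT ... VALUES (...) statement")
    head, values = match.group("head"), match.group("values")
    # Repeat the placeholder group once per row and flatten the
    # parameters so they line up with the repeated placeholders.
    sql = head + ", ".join([values] * len(seq_of_parameters))
    params = [value for row in seq_of_parameters for value in row]
    return sql, params
```

`executemany` could then call `cursor.execute(*batch_insert_sql(operation, seq_of_parameters))` when the statement matches, falling back to the current row-at-a-time loop otherwise; real code would also need to chunk very large batches to stay under BigQuery's query-length limits.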
Describe alternatives you've considered
As a workaround, I will probably apply a crude patch to the sqlalchemy-bigquery DBAPI cursor to enable this behavior, since my ORM-based application needs this performance boost. Something roughly along the lines of the sketch below.
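A hypothetical, untested monkey patch reusing the `batch_insert_sql` helper sketched above to override the underlying DBAPI cursor's `executemany`:

```python
from google.cloud.bigquery.dbapi import cursor as bq_cursor

_original_executemany = bq_cursor.Cursor.executemany


def _batched_executemany(self, operation, seq_of_parameters):
    try:
        sql, params = batch_insert_sql(operation, seq_of_parameters)
    except ValueError:
        # Not a simple INSERT ... VALUES statement: keep stock behavior.
        return _original_executemany(self, operation, seq_of_parameters)
    # Send the whole batch as one multi-row INSERT job.
    return self.execute(sql, params)


bq_cursor.Cursor.executemany = _batched_executemany
```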
Additional context
sqlalchemy-bigquery related ticket: googleapis/python-bigquery-sqlalchemy#497
sqlalchemy related discussion: sqlalchemy/sqlalchemy#12038