BigQuery Stored Procedures for Apache Spark Preview Enrollment
BigQuery Stored Procedures for Apache Spark is a new feature being introduced in BigQuery. With this feature, you can create Apache Spark stored procedures written in Python directly in BigQuery. You can then run and schedule these stored procedures in BigQuery using a Google Standard SQL query, just as you would run SQL stored procedures.
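As a rough illustration of the workflow described above, a Spark stored procedure is declared with GoogleSQL DDL and then invoked with CALL. The sketch below is an assumption-laden example, not official syntax guidance: the project, dataset, connection, and runtime names are placeholders, and details may change during the Preview — consult https://cloud.google.com/bigquery/docs/spark-procedures for the authoritative syntax.

```sql
-- Sketch only: `my-project`, `my_dataset`, `us.my-spark-connection`, and the
-- runtime version are hypothetical placeholders.
CREATE OR REPLACE PROCEDURE my_dataset.spark_word_count()
WITH CONNECTION `my-project.us.my-spark-connection`
OPTIONS (engine = 'SPARK')
LANGUAGE PYTHON AS r"""
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("word-count").getOrCreate()
df = spark.createDataFrame([("hello",), ("world",), ("hello",)], ["word"])
df.groupBy("word").count().show()
"""
;

-- Run it like any other stored procedure:
CALL my_dataset.spark_word_count();
```

The procedure body is ordinary PySpark code; BigQuery provisions the Spark environment through the referenced connection when the procedure is called.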
*********************************************************************
* This is a preview feature. Please note the following limitations: *
*********************************************************************

1. As a Preview feature, this is not intended for production workloads, per the official Terms of Service: https://cloud.google.com/products#section-22.

2. For the complete list of limitations, see https://cloud.google.com/bigquery/docs/spark-procedures.

For any questions about the Preview Enrollment, please contact us at bigspark-preview@google.com.

Email Address *
Organization Name *
Project Numbers *
Specify the projects in which you will create and execute Spark stored procedures. For multiple projects, enter a semicolon-separated list of project numbers. See http://console.cloud.google.com/home/dashboard to find your project number. Example: "1231231244114; 123123123123". Note that if you enter an INCORRECT value, the service will not be activated, and you will be notified of the failure at the email address you provided above.
Use Case *
How do you plan to use the feature?
Would you be open to providing us feedback about this feature in a product research session?  *